Jan 27 19:57:38 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 27 19:57:38 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 27 19:57:38 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 27 19:57:38 localhost kernel: BIOS-provided physical RAM map:
Jan 27 19:57:38 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 27 19:57:38 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 27 19:57:38 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 27 19:57:38 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 27 19:57:38 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 27 19:57:38 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 27 19:57:38 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 27 19:57:38 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 27 19:57:38 localhost kernel: NX (Execute Disable) protection: active
Jan 27 19:57:38 localhost kernel: APIC: Static calls initialized
Jan 27 19:57:38 localhost kernel: SMBIOS 2.8 present.
Jan 27 19:57:38 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 27 19:57:38 localhost kernel: Hypervisor detected: KVM
Jan 27 19:57:38 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 27 19:57:38 localhost kernel: kvm-clock: using sched offset of 6130591990 cycles
Jan 27 19:57:38 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 27 19:57:38 localhost kernel: tsc: Detected 2800.000 MHz processor
Jan 27 19:57:38 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 27 19:57:38 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 27 19:57:38 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 27 19:57:38 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 27 19:57:38 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 27 19:57:38 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 27 19:57:38 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 27 19:57:38 localhost kernel: Using GB pages for direct mapping
Jan 27 19:57:38 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 27 19:57:38 localhost kernel: ACPI: Early table checksum verification disabled
Jan 27 19:57:38 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 27 19:57:38 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 27 19:57:38 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 27 19:57:38 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 27 19:57:38 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 27 19:57:38 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 27 19:57:38 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 27 19:57:38 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 27 19:57:38 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 27 19:57:38 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 27 19:57:38 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 27 19:57:38 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 27 19:57:38 localhost kernel: No NUMA configuration found
Jan 27 19:57:38 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 27 19:57:38 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 27 19:57:38 localhost kernel: crashkernel reserved: 0x00000000a8000000 - 0x00000000b8000000 (256 MB)
Jan 27 19:57:38 localhost kernel: Zone ranges:
Jan 27 19:57:38 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 27 19:57:38 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 27 19:57:38 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 27 19:57:38 localhost kernel:   Device   empty
Jan 27 19:57:38 localhost kernel: Movable zone start for each node
Jan 27 19:57:38 localhost kernel: Early memory node ranges
Jan 27 19:57:38 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 27 19:57:38 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 27 19:57:38 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 27 19:57:38 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 27 19:57:38 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 27 19:57:38 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 27 19:57:38 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 27 19:57:38 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 27 19:57:38 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 27 19:57:38 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 27 19:57:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 27 19:57:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 27 19:57:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 27 19:57:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 27 19:57:38 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 27 19:57:38 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 27 19:57:38 localhost kernel: TSC deadline timer available
Jan 27 19:57:38 localhost kernel: CPU topo: Max. logical packages:   8
Jan 27 19:57:38 localhost kernel: CPU topo: Max. logical dies:       8
Jan 27 19:57:38 localhost kernel: CPU topo: Max. dies per package:   1
Jan 27 19:57:38 localhost kernel: CPU topo: Max. threads per core:   1
Jan 27 19:57:38 localhost kernel: CPU topo: Num. cores per package:     1
Jan 27 19:57:38 localhost kernel: CPU topo: Num. threads per package:   1
Jan 27 19:57:38 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 27 19:57:38 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 27 19:57:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 27 19:57:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 27 19:57:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 27 19:57:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 27 19:57:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 27 19:57:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 27 19:57:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 27 19:57:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 27 19:57:38 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 27 19:57:38 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 27 19:57:38 localhost kernel: Booting paravirtualized kernel on KVM
Jan 27 19:57:38 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 27 19:57:38 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 27 19:57:38 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 27 19:57:38 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 27 19:57:38 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 27 19:57:38 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 27 19:57:38 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 27 19:57:38 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 27 19:57:38 localhost kernel: random: crng init done
Jan 27 19:57:38 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 27 19:57:38 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 27 19:57:38 localhost kernel: Fallback order for Node 0: 0 
Jan 27 19:57:38 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 27 19:57:38 localhost kernel: Policy zone: Normal
Jan 27 19:57:38 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 27 19:57:38 localhost kernel: software IO TLB: area num 8.
Jan 27 19:57:38 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 27 19:57:38 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 27 19:57:38 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 27 19:57:38 localhost kernel: Dynamic Preempt: voluntary
Jan 27 19:57:38 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 27 19:57:38 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 27 19:57:38 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 27 19:57:38 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 27 19:57:38 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 27 19:57:38 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 27 19:57:38 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 27 19:57:38 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 27 19:57:38 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 27 19:57:38 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 27 19:57:38 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 27 19:57:38 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 27 19:57:38 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 27 19:57:38 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 27 19:57:38 localhost kernel: Console: colour VGA+ 80x25
Jan 27 19:57:38 localhost kernel: printk: console [ttyS0] enabled
Jan 27 19:57:38 localhost kernel: ACPI: Core revision 20230331
Jan 27 19:57:38 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 27 19:57:38 localhost kernel: x2apic enabled
Jan 27 19:57:38 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 27 19:57:38 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 27 19:57:38 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 27 19:57:38 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 27 19:57:38 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 27 19:57:38 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 27 19:57:38 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 27 19:57:38 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 27 19:57:38 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 27 19:57:38 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 27 19:57:38 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 27 19:57:38 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 27 19:57:38 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 27 19:57:38 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 27 19:57:38 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 27 19:57:38 localhost kernel: x86/bugs: return thunk changed
Jan 27 19:57:38 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 27 19:57:38 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 27 19:57:38 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 27 19:57:38 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 27 19:57:38 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 27 19:57:38 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 27 19:57:38 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 27 19:57:38 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 27 19:57:38 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 27 19:57:38 localhost kernel: landlock: Up and running.
Jan 27 19:57:38 localhost kernel: Yama: becoming mindful.
Jan 27 19:57:38 localhost kernel: SELinux:  Initializing.
Jan 27 19:57:38 localhost kernel: LSM support for eBPF active
Jan 27 19:57:38 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 27 19:57:38 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 27 19:57:38 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 27 19:57:38 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 27 19:57:38 localhost kernel: ... version:                0
Jan 27 19:57:38 localhost kernel: ... bit width:              48
Jan 27 19:57:38 localhost kernel: ... generic registers:      6
Jan 27 19:57:38 localhost kernel: ... value mask:             0000ffffffffffff
Jan 27 19:57:38 localhost kernel: ... max period:             00007fffffffffff
Jan 27 19:57:38 localhost kernel: ... fixed-purpose events:   0
Jan 27 19:57:38 localhost kernel: ... event mask:             000000000000003f
Jan 27 19:57:38 localhost kernel: signal: max sigframe size: 1776
Jan 27 19:57:38 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 27 19:57:38 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 27 19:57:38 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 27 19:57:38 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 27 19:57:38 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 27 19:57:38 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 27 19:57:38 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 27 19:57:38 localhost kernel: node 0 deferred pages initialised in 22ms
Jan 27 19:57:38 localhost kernel: Memory: 7763768K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618364K reserved, 0K cma-reserved)
Jan 27 19:57:38 localhost kernel: devtmpfs: initialized
Jan 27 19:57:38 localhost kernel: x86/mm: Memory block size: 128MB
Jan 27 19:57:38 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 27 19:57:38 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 27 19:57:38 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 27 19:57:38 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 27 19:57:38 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 27 19:57:38 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 27 19:57:38 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 27 19:57:38 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 27 19:57:38 localhost kernel: audit: type=2000 audit(1769543857.117:1): state=initialized audit_enabled=0 res=1
Jan 27 19:57:38 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 27 19:57:38 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 27 19:57:38 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 27 19:57:38 localhost kernel: cpuidle: using governor menu
Jan 27 19:57:38 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 27 19:57:38 localhost kernel: PCI: Using configuration type 1 for base access
Jan 27 19:57:38 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 27 19:57:38 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 27 19:57:38 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 27 19:57:38 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 27 19:57:38 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 27 19:57:38 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 27 19:57:38 localhost kernel: Demotion targets for Node 0: null
Jan 27 19:57:38 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 27 19:57:38 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 27 19:57:38 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 27 19:57:38 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 27 19:57:38 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 27 19:57:38 localhost kernel: ACPI: Interpreter enabled
Jan 27 19:57:38 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 27 19:57:38 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 27 19:57:38 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 27 19:57:38 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 27 19:57:38 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 27 19:57:38 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 27 19:57:38 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [3] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [4] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [5] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [6] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [7] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [8] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [9] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [10] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [11] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [12] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [13] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [14] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [15] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [16] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [17] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [18] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [19] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [20] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [21] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [22] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [23] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [24] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [25] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [26] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [27] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [28] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [29] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [30] registered
Jan 27 19:57:38 localhost kernel: acpiphp: Slot [31] registered
Jan 27 19:57:38 localhost kernel: PCI host bridge to bus 0000:00
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 27 19:57:38 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 27 19:57:38 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 27 19:57:38 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 27 19:57:38 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 27 19:57:38 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 27 19:57:38 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 27 19:57:38 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 27 19:57:38 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 27 19:57:38 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 27 19:57:38 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 27 19:57:38 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 27 19:57:38 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 27 19:57:38 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 27 19:57:38 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 27 19:57:38 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 27 19:57:38 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 27 19:57:38 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 27 19:57:38 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 27 19:57:38 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 27 19:57:38 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 27 19:57:38 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 27 19:57:38 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 27 19:57:38 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 27 19:57:38 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 27 19:57:38 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 27 19:57:38 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 27 19:57:38 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 27 19:57:38 localhost kernel: iommu: Default domain type: Translated
Jan 27 19:57:38 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 27 19:57:38 localhost kernel: SCSI subsystem initialized
Jan 27 19:57:38 localhost kernel: ACPI: bus type USB registered
Jan 27 19:57:38 localhost kernel: usbcore: registered new interface driver usbfs
Jan 27 19:57:38 localhost kernel: usbcore: registered new interface driver hub
Jan 27 19:57:38 localhost kernel: usbcore: registered new device driver usb
Jan 27 19:57:38 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 27 19:57:38 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 27 19:57:38 localhost kernel: PTP clock support registered
Jan 27 19:57:38 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 27 19:57:38 localhost kernel: NetLabel: Initializing
Jan 27 19:57:38 localhost kernel: NetLabel:  domain hash size = 128
Jan 27 19:57:38 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 27 19:57:38 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 27 19:57:38 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 27 19:57:38 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 27 19:57:38 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 27 19:57:38 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 27 19:57:38 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 27 19:57:38 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 27 19:57:38 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 27 19:57:38 localhost kernel: vgaarb: loaded
Jan 27 19:57:38 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 27 19:57:38 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 27 19:57:38 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 27 19:57:38 localhost kernel: pnp: PnP ACPI init
Jan 27 19:57:38 localhost kernel: pnp 00:03: [dma 2]
Jan 27 19:57:38 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 27 19:57:38 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 27 19:57:38 localhost kernel: NET: Registered PF_INET protocol family
Jan 27 19:57:38 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 27 19:57:38 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 27 19:57:38 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 27 19:57:38 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 27 19:57:38 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 27 19:57:38 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 27 19:57:38 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 27 19:57:38 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 27 19:57:38 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 27 19:57:38 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 27 19:57:38 localhost kernel: NET: Registered PF_XDP protocol family
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 27 19:57:38 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 27 19:57:38 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 27 19:57:38 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 27 19:57:38 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 72745 usecs
Jan 27 19:57:38 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 27 19:57:38 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 27 19:57:38 localhost kernel: software IO TLB: mapped [mem 0x00000000bbfdb000-0x00000000bffdb000] (64MB)
Jan 27 19:57:38 localhost kernel: ACPI: bus type thunderbolt registered
Jan 27 19:57:38 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 27 19:57:38 localhost kernel: Initialise system trusted keyrings
Jan 27 19:57:38 localhost kernel: Key type blacklist registered
Jan 27 19:57:38 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 27 19:57:38 localhost kernel: zbud: loaded
Jan 27 19:57:38 localhost kernel: integrity: Platform Keyring initialized
Jan 27 19:57:38 localhost kernel: integrity: Machine keyring initialized
Jan 27 19:57:38 localhost kernel: Freeing initrd memory: 87956K
Jan 27 19:57:38 localhost kernel: NET: Registered PF_ALG protocol family
Jan 27 19:57:38 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 27 19:57:38 localhost kernel: Key type asymmetric registered
Jan 27 19:57:38 localhost kernel: Asymmetric key parser 'x509' registered
Jan 27 19:57:38 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 27 19:57:38 localhost kernel: io scheduler mq-deadline registered
Jan 27 19:57:38 localhost kernel: io scheduler kyber registered
Jan 27 19:57:38 localhost kernel: io scheduler bfq registered
Jan 27 19:57:38 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 27 19:57:38 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 27 19:57:38 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 27 19:57:38 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 27 19:57:38 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 27 19:57:38 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 27 19:57:38 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 27 19:57:38 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 27 19:57:38 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 27 19:57:38 localhost kernel: Non-volatile memory driver v1.3
Jan 27 19:57:38 localhost kernel: rdac: device handler registered
Jan 27 19:57:38 localhost kernel: hp_sw: device handler registered
Jan 27 19:57:38 localhost kernel: emc: device handler registered
Jan 27 19:57:38 localhost kernel: alua: device handler registered
Jan 27 19:57:38 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 27 19:57:38 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 27 19:57:38 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 27 19:57:38 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 27 19:57:38 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 27 19:57:38 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 27 19:57:38 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 27 19:57:38 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 27 19:57:38 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 27 19:57:38 localhost kernel: hub 1-0:1.0: USB hub found
Jan 27 19:57:38 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 27 19:57:38 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 27 19:57:38 localhost kernel: usbserial: USB Serial support registered for generic
Jan 27 19:57:38 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 27 19:57:38 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 27 19:57:38 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 27 19:57:38 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 27 19:57:38 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 27 19:57:38 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 27 19:57:38 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 27 19:57:38 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 27 19:57:38 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-27T19:57:37 UTC (1769543857)
Jan 27 19:57:38 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 27 19:57:38 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 27 19:57:38 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 27 19:57:38 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 27 19:57:38 localhost kernel: usbcore: registered new interface driver usbhid
Jan 27 19:57:38 localhost kernel: usbhid: USB HID core driver
Jan 27 19:57:38 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 27 19:57:38 localhost kernel: Initializing XFRM netlink socket
Jan 27 19:57:38 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 27 19:57:38 localhost kernel: Segment Routing with IPv6
Jan 27 19:57:38 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 27 19:57:38 localhost kernel: mpls_gso: MPLS GSO support
Jan 27 19:57:38 localhost kernel: IPI shorthand broadcast: enabled
Jan 27 19:57:38 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 27 19:57:38 localhost kernel: AES CTR mode by8 optimization enabled
Jan 27 19:57:38 localhost kernel: sched_clock: Marking stable (1626010739, 150111640)->(1898732099, -122609720)
Jan 27 19:57:38 localhost kernel: registered taskstats version 1
Jan 27 19:57:38 localhost kernel: Loading compiled-in X.509 certificates
Jan 27 19:57:38 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 27 19:57:38 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 27 19:57:38 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 27 19:57:38 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 27 19:57:38 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 27 19:57:38 localhost kernel: Demotion targets for Node 0: null
Jan 27 19:57:38 localhost kernel: page_owner is disabled
Jan 27 19:57:38 localhost kernel: Key type .fscrypt registered
Jan 27 19:57:38 localhost kernel: Key type fscrypt-provisioning registered
Jan 27 19:57:38 localhost kernel: Key type big_key registered
Jan 27 19:57:38 localhost kernel: Key type encrypted registered
Jan 27 19:57:38 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 27 19:57:38 localhost kernel: Loading compiled-in module X.509 certificates
Jan 27 19:57:38 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 27 19:57:38 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 27 19:57:38 localhost kernel: ima: No architecture policies found
Jan 27 19:57:38 localhost kernel: evm: Initialising EVM extended attributes:
Jan 27 19:57:38 localhost kernel: evm: security.selinux
Jan 27 19:57:38 localhost kernel: evm: security.SMACK64 (disabled)
Jan 27 19:57:38 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 27 19:57:38 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 27 19:57:38 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 27 19:57:38 localhost kernel: evm: security.apparmor (disabled)
Jan 27 19:57:38 localhost kernel: evm: security.ima
Jan 27 19:57:38 localhost kernel: evm: security.capability
Jan 27 19:57:38 localhost kernel: evm: HMAC attrs: 0x1
Jan 27 19:57:38 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 27 19:57:38 localhost kernel: Running certificate verification RSA selftest
Jan 27 19:57:38 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 27 19:57:38 localhost kernel: Running certificate verification ECDSA selftest
Jan 27 19:57:38 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 27 19:57:38 localhost kernel: clk: Disabling unused clocks
Jan 27 19:57:38 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 27 19:57:38 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 27 19:57:38 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 27 19:57:38 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 27 19:57:38 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 27 19:57:38 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 27 19:57:38 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 27 19:57:38 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 27 19:57:38 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 27 19:57:38 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 27 19:57:38 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 27 19:57:38 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 27 19:57:38 localhost kernel: Run /init as init process
Jan 27 19:57:38 localhost kernel:   with arguments:
Jan 27 19:57:38 localhost kernel:     /init
Jan 27 19:57:38 localhost kernel:   with environment:
Jan 27 19:57:38 localhost kernel:     HOME=/
Jan 27 19:57:38 localhost kernel:     TERM=linux
Jan 27 19:57:38 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 27 19:57:38 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 27 19:57:38 localhost systemd[1]: Detected virtualization kvm.
Jan 27 19:57:38 localhost systemd[1]: Detected architecture x86-64.
Jan 27 19:57:38 localhost systemd[1]: Running in initrd.
Jan 27 19:57:38 localhost systemd[1]: No hostname configured, using default hostname.
Jan 27 19:57:38 localhost systemd[1]: Hostname set to <localhost>.
Jan 27 19:57:38 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 27 19:57:38 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 27 19:57:38 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 27 19:57:38 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 27 19:57:38 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 27 19:57:38 localhost systemd[1]: Reached target Local File Systems.
Jan 27 19:57:38 localhost systemd[1]: Reached target Path Units.
Jan 27 19:57:38 localhost systemd[1]: Reached target Slice Units.
Jan 27 19:57:38 localhost systemd[1]: Reached target Swaps.
Jan 27 19:57:38 localhost systemd[1]: Reached target Timer Units.
Jan 27 19:57:38 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 27 19:57:38 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 27 19:57:38 localhost systemd[1]: Listening on Journal Socket.
Jan 27 19:57:38 localhost systemd[1]: Listening on udev Control Socket.
Jan 27 19:57:38 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 27 19:57:38 localhost systemd[1]: Reached target Socket Units.
Jan 27 19:57:38 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 27 19:57:38 localhost systemd[1]: Starting Journal Service...
Jan 27 19:57:38 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 27 19:57:38 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 27 19:57:38 localhost systemd[1]: Starting Create System Users...
Jan 27 19:57:38 localhost systemd[1]: Starting Setup Virtual Console...
Jan 27 19:57:38 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 27 19:57:38 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 27 19:57:38 localhost systemd-journald[307]: Journal started
Jan 27 19:57:38 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/3b9a1f76d31549d890b4a523eb6cf5fa) is 8.0M, max 153.6M, 145.6M free.
Jan 27 19:57:38 localhost systemd[1]: Started Journal Service.
Jan 27 19:57:38 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Jan 27 19:57:38 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Jan 27 19:57:38 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 27 19:57:38 localhost systemd[1]: Finished Create System Users.
Jan 27 19:57:38 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 27 19:57:38 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 27 19:57:38 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 27 19:57:38 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 27 19:57:38 localhost systemd[1]: Finished Setup Virtual Console.
Jan 27 19:57:38 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 27 19:57:38 localhost systemd[1]: Starting dracut cmdline hook...
Jan 27 19:57:38 localhost dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Jan 27 19:57:38 localhost dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 27 19:57:38 localhost systemd[1]: Finished dracut cmdline hook.
Jan 27 19:57:38 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 27 19:57:38 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 27 19:57:38 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 27 19:57:38 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 27 19:57:38 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 27 19:57:38 localhost kernel: RPC: Registered udp transport module.
Jan 27 19:57:38 localhost kernel: RPC: Registered tcp transport module.
Jan 27 19:57:38 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 27 19:57:38 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 27 19:57:38 localhost rpc.statd[444]: Version 2.5.4 starting
Jan 27 19:57:38 localhost rpc.statd[444]: Initializing NSM state
Jan 27 19:57:38 localhost rpc.idmapd[449]: Setting log level to 0
Jan 27 19:57:38 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 27 19:57:38 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 27 19:57:38 localhost systemd-udevd[462]: Using default interface naming scheme 'rhel-9.0'.
Jan 27 19:57:38 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 27 19:57:38 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 27 19:57:38 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 27 19:57:38 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 27 19:57:38 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 27 19:57:38 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 27 19:57:38 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 27 19:57:38 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 27 19:57:38 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 27 19:57:38 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 27 19:57:38 localhost systemd[1]: Reached target Network.
Jan 27 19:57:38 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 27 19:57:38 localhost systemd[1]: Starting dracut initqueue hook...
Jan 27 19:57:39 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 27 19:57:39 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 27 19:57:39 localhost kernel:  vda: vda1
Jan 27 19:57:39 localhost systemd-udevd[478]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 19:57:39 localhost kernel: libata version 3.00 loaded.
Jan 27 19:57:39 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 27 19:57:39 localhost kernel: scsi host0: ata_piix
Jan 27 19:57:39 localhost kernel: scsi host1: ata_piix
Jan 27 19:57:39 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 27 19:57:39 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 27 19:57:39 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 27 19:57:39 localhost systemd[1]: Reached target Initrd Root Device.
Jan 27 19:57:39 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 27 19:57:39 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 27 19:57:39 localhost systemd[1]: Reached target System Initialization.
Jan 27 19:57:39 localhost systemd[1]: Reached target Basic System.
Jan 27 19:57:39 localhost kernel: ata1: found unknown device (class 0)
Jan 27 19:57:39 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 27 19:57:39 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 27 19:57:39 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 27 19:57:39 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 27 19:57:39 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 27 19:57:39 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 27 19:57:39 localhost systemd[1]: Finished dracut initqueue hook.
Jan 27 19:57:39 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 27 19:57:39 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 27 19:57:39 localhost systemd[1]: Reached target Remote File Systems.
Jan 27 19:57:39 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 27 19:57:39 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 27 19:57:39 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 27 19:57:39 localhost systemd-fsck[553]: /usr/sbin/fsck.xfs: XFS file system.
Jan 27 19:57:39 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 27 19:57:39 localhost systemd[1]: Mounting /sysroot...
Jan 27 19:57:39 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 27 19:57:40 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 27 19:57:40 localhost kernel: XFS (vda1): Ending clean mount
Jan 27 19:57:40 localhost systemd[1]: Mounted /sysroot.
Jan 27 19:57:40 localhost systemd[1]: Reached target Initrd Root File System.
Jan 27 19:57:40 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 27 19:57:40 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 27 19:57:40 localhost systemd[1]: Reached target Initrd File Systems.
Jan 27 19:57:40 localhost systemd[1]: Reached target Initrd Default Target.
Jan 27 19:57:40 localhost systemd[1]: Starting dracut mount hook...
Jan 27 19:57:40 localhost systemd[1]: Finished dracut mount hook.
Jan 27 19:57:40 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 27 19:57:40 localhost rpc.idmapd[449]: exiting on signal 15
Jan 27 19:57:40 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 27 19:57:40 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 27 19:57:40 localhost systemd[1]: Stopped target Network.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Timer Units.
Jan 27 19:57:40 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 27 19:57:40 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Basic System.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Path Units.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Remote File Systems.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Slice Units.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Socket Units.
Jan 27 19:57:40 localhost systemd[1]: Stopped target System Initialization.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Local File Systems.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Swaps.
Jan 27 19:57:40 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped dracut mount hook.
Jan 27 19:57:40 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 27 19:57:40 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 27 19:57:40 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 27 19:57:40 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 27 19:57:40 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 27 19:57:40 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 27 19:57:40 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 27 19:57:40 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 27 19:57:40 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 27 19:57:40 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 27 19:57:40 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 27 19:57:40 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 27 19:57:40 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Closed udev Control Socket.
Jan 27 19:57:40 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Closed udev Kernel Socket.
Jan 27 19:57:40 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 27 19:57:40 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 27 19:57:40 localhost systemd[1]: Starting Cleanup udev Database...
Jan 27 19:57:40 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 27 19:57:40 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 27 19:57:40 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Stopped Create System Users.
Jan 27 19:57:40 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 27 19:57:40 localhost systemd[1]: Finished Cleanup udev Database.
Jan 27 19:57:40 localhost systemd[1]: Reached target Switch Root.
Jan 27 19:57:40 localhost systemd[1]: Starting Switch Root...
Jan 27 19:57:40 localhost systemd[1]: Switching root.
Jan 27 19:57:40 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Jan 27 19:57:40 localhost systemd-journald[307]: Journal stopped
Jan 27 19:57:41 localhost kernel: audit: type=1404 audit(1769543860.547:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 27 19:57:41 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 27 19:57:41 localhost kernel: SELinux:  policy capability open_perms=1
Jan 27 19:57:41 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 27 19:57:41 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 27 19:57:41 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 27 19:57:41 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 27 19:57:41 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 27 19:57:41 localhost kernel: audit: type=1403 audit(1769543860.722:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 27 19:57:41 localhost systemd[1]: Successfully loaded SELinux policy in 182.709ms.
Jan 27 19:57:41 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 37.813ms.
Jan 27 19:57:41 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 27 19:57:41 localhost systemd[1]: Detected virtualization kvm.
Jan 27 19:57:41 localhost systemd[1]: Detected architecture x86-64.
Jan 27 19:57:41 localhost systemd-rc-local-generator[635]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 19:57:41 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 27 19:57:41 localhost systemd[1]: Stopped Switch Root.
Jan 27 19:57:41 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 27 19:57:41 localhost systemd[1]: Created slice Slice /system/getty.
Jan 27 19:57:41 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 27 19:57:41 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 27 19:57:41 localhost systemd[1]: Created slice User and Session Slice.
Jan 27 19:57:41 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 27 19:57:41 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 27 19:57:41 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 27 19:57:41 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 27 19:57:41 localhost systemd[1]: Stopped target Switch Root.
Jan 27 19:57:41 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 27 19:57:41 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 27 19:57:41 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 27 19:57:41 localhost systemd[1]: Reached target Path Units.
Jan 27 19:57:41 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 27 19:57:41 localhost systemd[1]: Reached target Slice Units.
Jan 27 19:57:41 localhost systemd[1]: Reached target Swaps.
Jan 27 19:57:41 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 27 19:57:41 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 27 19:57:41 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 27 19:57:41 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 27 19:57:41 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 27 19:57:41 localhost systemd[1]: Listening on udev Control Socket.
Jan 27 19:57:41 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 27 19:57:41 localhost systemd[1]: Mounting Huge Pages File System...
Jan 27 19:57:41 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 27 19:57:41 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 27 19:57:41 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 27 19:57:41 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 27 19:57:41 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 27 19:57:41 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 27 19:57:41 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 27 19:57:41 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 27 19:57:41 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 27 19:57:41 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 27 19:57:41 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 27 19:57:41 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 27 19:57:41 localhost systemd[1]: Stopped Journal Service.
Jan 27 19:57:41 localhost systemd[1]: Starting Journal Service...
Jan 27 19:57:41 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 27 19:57:41 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 27 19:57:41 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 27 19:57:41 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 27 19:57:41 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 27 19:57:41 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 27 19:57:41 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 27 19:57:41 localhost kernel: fuse: init (API version 7.37)
Jan 27 19:57:41 localhost systemd[1]: Mounted Huge Pages File System.
Jan 27 19:57:41 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 27 19:57:41 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 27 19:57:41 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 27 19:57:41 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 27 19:57:41 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 27 19:57:41 localhost systemd-journald[677]: Journal started
Jan 27 19:57:41 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 27 19:57:41 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 27 19:57:41 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 27 19:57:41 localhost systemd[1]: Started Journal Service.
Jan 27 19:57:41 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 27 19:57:41 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 27 19:57:41 localhost kernel: ACPI: bus type drm_connector registered
Jan 27 19:57:41 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 27 19:57:41 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 27 19:57:41 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 27 19:57:41 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 27 19:57:41 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 27 19:57:41 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 27 19:57:41 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 27 19:57:41 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 27 19:57:41 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 27 19:57:41 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 27 19:57:41 localhost systemd[1]: Mounting FUSE Control File System...
Jan 27 19:57:41 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 27 19:57:41 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 27 19:57:41 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 27 19:57:41 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 27 19:57:41 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 27 19:57:41 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 27 19:57:41 localhost systemd-journald[677]: Received client request to flush runtime journal.
Jan 27 19:57:41 localhost systemd[1]: Starting Create System Users...
Jan 27 19:57:41 localhost systemd[1]: Mounted FUSE Control File System.
Jan 27 19:57:41 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 27 19:57:41 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 27 19:57:41 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 27 19:57:41 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 27 19:57:41 localhost systemd[1]: Finished Create System Users.
Jan 27 19:57:41 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 27 19:57:41 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 27 19:57:41 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 27 19:57:41 localhost systemd[1]: Reached target Local File Systems.
Jan 27 19:57:41 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 27 19:57:41 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 27 19:57:41 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 27 19:57:41 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 27 19:57:41 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 27 19:57:41 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 27 19:57:41 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 27 19:57:41 localhost bootctl[696]: Couldn't find EFI system partition, skipping.
Jan 27 19:57:41 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 27 19:57:41 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 27 19:57:41 localhost systemd[1]: Starting Security Auditing Service...
Jan 27 19:57:41 localhost systemd[1]: Starting RPC Bind...
Jan 27 19:57:41 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 27 19:57:41 localhost auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 27 19:57:41 localhost auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 27 19:57:41 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 27 19:57:41 localhost systemd[1]: Started RPC Bind.
Jan 27 19:57:41 localhost augenrules[707]: /sbin/augenrules: No change
Jan 27 19:57:41 localhost augenrules[722]: No rules
Jan 27 19:57:41 localhost augenrules[722]: enabled 1
Jan 27 19:57:41 localhost augenrules[722]: failure 1
Jan 27 19:57:41 localhost augenrules[722]: pid 702
Jan 27 19:57:41 localhost augenrules[722]: rate_limit 0
Jan 27 19:57:41 localhost augenrules[722]: backlog_limit 8192
Jan 27 19:57:41 localhost augenrules[722]: lost 0
Jan 27 19:57:41 localhost augenrules[722]: backlog 3
Jan 27 19:57:41 localhost augenrules[722]: backlog_wait_time 60000
Jan 27 19:57:41 localhost augenrules[722]: backlog_wait_time_actual 0
Jan 27 19:57:41 localhost augenrules[722]: enabled 1
Jan 27 19:57:41 localhost augenrules[722]: failure 1
Jan 27 19:57:41 localhost augenrules[722]: pid 702
Jan 27 19:57:41 localhost augenrules[722]: rate_limit 0
Jan 27 19:57:41 localhost augenrules[722]: backlog_limit 8192
Jan 27 19:57:41 localhost augenrules[722]: lost 0
Jan 27 19:57:41 localhost augenrules[722]: backlog 0
Jan 27 19:57:41 localhost augenrules[722]: backlog_wait_time 60000
Jan 27 19:57:41 localhost augenrules[722]: backlog_wait_time_actual 0
Jan 27 19:57:41 localhost augenrules[722]: enabled 1
Jan 27 19:57:41 localhost augenrules[722]: failure 1
Jan 27 19:57:41 localhost augenrules[722]: pid 702
Jan 27 19:57:41 localhost augenrules[722]: rate_limit 0
Jan 27 19:57:41 localhost augenrules[722]: backlog_limit 8192
Jan 27 19:57:41 localhost augenrules[722]: lost 0
Jan 27 19:57:41 localhost augenrules[722]: backlog 3
Jan 27 19:57:41 localhost augenrules[722]: backlog_wait_time 60000
Jan 27 19:57:41 localhost augenrules[722]: backlog_wait_time_actual 0
Jan 27 19:57:41 localhost systemd[1]: Started Security Auditing Service.
Jan 27 19:57:41 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 27 19:57:41 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 27 19:57:42 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 27 19:57:42 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 27 19:57:42 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 27 19:57:42 localhost systemd-udevd[730]: Using default interface naming scheme 'rhel-9.0'.
Jan 27 19:57:42 localhost systemd[1]: Starting Update is Completed...
Jan 27 19:57:42 localhost systemd[1]: Finished Update is Completed.
Jan 27 19:57:42 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 27 19:57:42 localhost systemd[1]: Reached target System Initialization.
Jan 27 19:57:42 localhost systemd[1]: Started dnf makecache --timer.
Jan 27 19:57:42 localhost systemd[1]: Started Daily rotation of log files.
Jan 27 19:57:42 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 27 19:57:42 localhost systemd[1]: Reached target Timer Units.
Jan 27 19:57:42 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 27 19:57:42 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 27 19:57:42 localhost systemd[1]: Reached target Socket Units.
Jan 27 19:57:42 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 27 19:57:42 localhost systemd-udevd[742]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 19:57:42 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 27 19:57:42 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 27 19:57:42 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 27 19:57:42 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 27 19:57:42 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 27 19:57:42 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 27 19:57:42 localhost systemd[1]: Reached target Basic System.
Jan 27 19:57:42 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 27 19:57:42 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 27 19:57:42 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 27 19:57:42 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 27 19:57:42 localhost dbus-broker-lau[758]: Ready
Jan 27 19:57:42 localhost systemd[1]: Starting NTP client/server...
Jan 27 19:57:42 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 27 19:57:42 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 27 19:57:42 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 27 19:57:42 localhost systemd[1]: Started irqbalance daemon.
Jan 27 19:57:42 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 27 19:57:42 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 27 19:57:42 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 27 19:57:42 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 27 19:57:42 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 27 19:57:42 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 27 19:57:42 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 27 19:57:42 localhost systemd[1]: Starting User Login Management...
Jan 27 19:57:42 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 27 19:57:42 localhost chronyd[800]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 27 19:57:42 localhost chronyd[800]: Loaded 0 symmetric keys
Jan 27 19:57:42 localhost chronyd[800]: Using right/UTC timezone to obtain leap second data
Jan 27 19:57:42 localhost chronyd[800]: Loaded seccomp filter (level 2)
Jan 27 19:57:42 localhost systemd[1]: Started NTP client/server.
Jan 27 19:57:42 localhost systemd-logind[786]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 27 19:57:42 localhost systemd-logind[786]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 27 19:57:42 localhost systemd-logind[786]: New seat seat0.
Jan 27 19:57:42 localhost systemd[1]: Started User Login Management.
Jan 27 19:57:42 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 27 19:57:42 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 27 19:57:42 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 27 19:57:42 localhost kernel: Console: switching to colour dummy device 80x25
Jan 27 19:57:42 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 27 19:57:42 localhost kernel: [drm] features: -context_init
Jan 27 19:57:42 localhost kernel: [drm] number of scanouts: 1
Jan 27 19:57:42 localhost kernel: [drm] number of cap sets: 0
Jan 27 19:57:42 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 27 19:57:42 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 27 19:57:42 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 27 19:57:42 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 27 19:57:42 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 27 19:57:42 localhost kernel: kvm_amd: TSC scaling supported
Jan 27 19:57:42 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 27 19:57:42 localhost kernel: kvm_amd: Nested Paging enabled
Jan 27 19:57:42 localhost kernel: kvm_amd: LBR virtualization supported
Jan 27 19:57:42 localhost iptables.init[777]: iptables: Applying firewall rules: [  OK  ]
Jan 27 19:57:42 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 27 19:57:43 localhost cloud-init[839]: Cloud-init v. 24.4-8.el9 running 'init-local' at Tue, 27 Jan 2026 19:57:43 +0000. Up 7.29 seconds.
Jan 27 19:57:43 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 27 19:57:43 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 27 19:57:43 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp5xmg_wvp.mount: Deactivated successfully.
Jan 27 19:57:43 localhost systemd[1]: Starting Hostname Service...
Jan 27 19:57:43 localhost systemd[1]: Started Hostname Service.
Jan 27 19:57:43 np0005598095.novalocal systemd-hostnamed[853]: Hostname set to <np0005598095.novalocal> (static)
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Reached target Preparation for Network.
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Starting Network Manager...
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8375] NetworkManager (version 1.54.3-2.el9) is starting... (boot:ee988f58-1e06-463e-8261-22f688d902e1)
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8381] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8538] manager[0x55a5b31dc000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8586] hostname: hostname: using hostnamed
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8586] hostname: static hostname changed from (none) to "np0005598095.novalocal"
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8591] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8719] manager[0x55a5b31dc000]: rfkill: Wi-Fi hardware radio set enabled
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8720] manager[0x55a5b31dc000]: rfkill: WWAN hardware radio set enabled
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8844] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8845] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8845] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8846] manager: Networking is enabled by state file
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8848] settings: Loaded settings plugin: keyfile (internal)
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8902] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8932] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8963] dhcp: init: Using DHCP client 'internal'
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8967] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8982] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.8999] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9027] device (lo): Activation: starting connection 'lo' (f9304b27-0492-4654-ac3b-87bfd4814846)
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9039] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9043] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Started Network Manager.
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9125] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9132] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9135] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9139] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9142] device (eth0): carrier: link connected
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Reached target Network.
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9147] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9154] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9163] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9168] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9169] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9172] manager: NetworkManager state is now CONNECTING
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9174] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9181] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9185] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9233] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9240] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9261] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9267] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9268] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9273] device (lo): Activation: successful, device activated.
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9293] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9295] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9303] manager: NetworkManager state is now CONNECTED_SITE
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9308] device (eth0): Activation: successful, device activated.
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9314] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 27 19:57:43 np0005598095.novalocal NetworkManager[857]: <info>  [1769543863.9317] manager: startup complete
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Reached target NFS client services.
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Reached target Remote File Systems.
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 27 19:57:43 np0005598095.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: Cloud-init v. 24.4-8.el9 running 'init' at Tue, 27 Jan 2026 19:57:44 +0000. Up 8.35 seconds.
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: |  eth0  | True |        38.102.83.110         | 255.255.255.0 | global | fa:16:3e:fc:b1:e4 |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fefc:b1e4/64 |       .       |  link  | fa:16:3e:fc:b1:e4 |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 27 19:57:44 np0005598095.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 27 19:57:45 np0005598095.novalocal useradd[989]: new group: name=cloud-user, GID=1001
Jan 27 19:57:45 np0005598095.novalocal useradd[989]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 27 19:57:45 np0005598095.novalocal useradd[989]: add 'cloud-user' to group 'adm'
Jan 27 19:57:45 np0005598095.novalocal useradd[989]: add 'cloud-user' to group 'systemd-journal'
Jan 27 19:57:45 np0005598095.novalocal useradd[989]: add 'cloud-user' to shadow group 'adm'
Jan 27 19:57:45 np0005598095.novalocal useradd[989]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: Generating public/private rsa key pair.
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: The key fingerprint is:
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: SHA256:rULUbgVBLspup5+4IQaRTRlbZCJ5RZbHdM4jlBC7VCM root@np0005598095.novalocal
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: The key's randomart image is:
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: +---[RSA 3072]----+
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |..+*E*=o=.       |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |.=o*.=+B .       |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |o.o o.+ * .      |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: | . o + + +       |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |.   + . S .      |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: | . . . . .       |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |  o + o .        |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: | . o = o         |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |    +oo          |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: +----[SHA256]-----+
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: Generating public/private ecdsa key pair.
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: The key fingerprint is:
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: SHA256:++pToIU2Om9uclk6L0fWmNYG0leozadFOMEBbVockcM root@np0005598095.novalocal
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: The key's randomart image is:
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: +---[ECDSA 256]---+
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |        .*=O     |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |          E o    |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |       o B =     |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |      = B + o    |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |     o =SO +     |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |    o . B.*      |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |     o B.o       |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |    . X o.       |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |     *.*+o.      |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: +----[SHA256]-----+
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: Generating public/private ed25519 key pair.
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: The key fingerprint is:
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: SHA256:U340m8fysmzWOtguPcaZCAdnt7lki4t3xtMzkUK3qUM root@np0005598095.novalocal
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: The key's randomart image is:
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: +--[ED25519 256]--+
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |                 |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |                 |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |          . o    |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |        .oo.o=.  |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |        S+.o=+o+ |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |        ....E+=  |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |         o %.X.. |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |         .*.^o=  |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: |        ...X=* o |
Jan 27 19:57:45 np0005598095.novalocal cloud-init[923]: +----[SHA256]-----+
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Reached target Network is Online.
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Starting System Logging Service...
Jan 27 19:57:45 np0005598095.novalocal sm-notify[1005]: Version 2.5.4 starting
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Starting Permit User Sessions...
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 27 19:57:45 np0005598095.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Jan 27 19:57:45 np0005598095.novalocal sshd[1007]: Server listening on :: port 22.
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Finished Permit User Sessions.
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Started Command Scheduler.
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Started Getty on tty1.
Jan 27 19:57:45 np0005598095.novalocal crond[1010]: (CRON) STARTUP (1.5.7)
Jan 27 19:57:45 np0005598095.novalocal crond[1010]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 27 19:57:45 np0005598095.novalocal crond[1010]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 13% if used.)
Jan 27 19:57:45 np0005598095.novalocal crond[1010]: (CRON) INFO (running with inotify support)
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Reached target Login Prompts.
Jan 27 19:57:45 np0005598095.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Jan 27 19:57:45 np0005598095.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Started System Logging Service.
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Reached target Multi-User System.
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 27 19:57:45 np0005598095.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 27 19:57:45 np0005598095.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 27 19:57:46 np0005598095.novalocal kdumpctl[1016]: kdump: No kdump initial ramdisk found.
Jan 27 19:57:46 np0005598095.novalocal kdumpctl[1016]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 27 19:57:46 np0005598095.novalocal cloud-init[1129]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Tue, 27 Jan 2026 19:57:46 +0000. Up 10.20 seconds.
Jan 27 19:57:46 np0005598095.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 27 19:57:46 np0005598095.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 27 19:57:46 np0005598095.novalocal dracut[1269]: dracut-057-102.git20250818.el9
Jan 27 19:57:46 np0005598095.novalocal cloud-init[1285]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Tue, 27 Jan 2026 19:57:46 +0000. Up 10.57 seconds.
Jan 27 19:57:46 np0005598095.novalocal cloud-init[1287]: #############################################################
Jan 27 19:57:46 np0005598095.novalocal cloud-init[1288]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 27 19:57:46 np0005598095.novalocal cloud-init[1291]: 256 SHA256:++pToIU2Om9uclk6L0fWmNYG0leozadFOMEBbVockcM root@np0005598095.novalocal (ECDSA)
Jan 27 19:57:46 np0005598095.novalocal cloud-init[1293]: 256 SHA256:U340m8fysmzWOtguPcaZCAdnt7lki4t3xtMzkUK3qUM root@np0005598095.novalocal (ED25519)
Jan 27 19:57:46 np0005598095.novalocal cloud-init[1296]: 3072 SHA256:rULUbgVBLspup5+4IQaRTRlbZCJ5RZbHdM4jlBC7VCM root@np0005598095.novalocal (RSA)
Jan 27 19:57:46 np0005598095.novalocal cloud-init[1298]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 27 19:57:46 np0005598095.novalocal cloud-init[1300]: #############################################################
Jan 27 19:57:46 np0005598095.novalocal sshd-session[1289]: Connection reset by 38.102.83.114 port 52386 [preauth]
Jan 27 19:57:46 np0005598095.novalocal dracut[1271]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 27 19:57:46 np0005598095.novalocal sshd-session[1302]: Unable to negotiate with 38.102.83.114 port 52388: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 27 19:57:46 np0005598095.novalocal cloud-init[1285]: Cloud-init v. 24.4-8.el9 finished at Tue, 27 Jan 2026 19:57:46 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.77 seconds
Jan 27 19:57:46 np0005598095.novalocal sshd-session[1325]: Connection closed by 38.102.83.114 port 52398 [preauth]
Jan 27 19:57:46 np0005598095.novalocal sshd-session[1334]: Unable to negotiate with 38.102.83.114 port 52404: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 27 19:57:46 np0005598095.novalocal sshd-session[1346]: Unable to negotiate with 38.102.83.114 port 52414: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 27 19:57:46 np0005598095.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 27 19:57:46 np0005598095.novalocal systemd[1]: Reached target Cloud-init target.
Jan 27 19:57:46 np0005598095.novalocal sshd-session[1366]: Unable to negotiate with 38.102.83.114 port 52452: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 27 19:57:46 np0005598095.novalocal sshd-session[1368]: Unable to negotiate with 38.102.83.114 port 52468: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 27 19:57:46 np0005598095.novalocal sshd-session[1355]: Connection closed by 38.102.83.114 port 52426 [preauth]
Jan 27 19:57:46 np0005598095.novalocal sshd-session[1364]: Connection closed by 38.102.83.114 port 52438 [preauth]
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: memstrack is not available
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 27 19:57:47 np0005598095.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: memstrack is not available
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 27 19:57:48 np0005598095.novalocal dracut[1271]: *** Including module: systemd ***
Jan 27 19:57:48 np0005598095.novalocal chronyd[800]: Selected source 198.181.199.86 (2.centos.pool.ntp.org)
Jan 27 19:57:48 np0005598095.novalocal chronyd[800]: System clock wrong by 1.734861 seconds
Jan 27 19:57:50 np0005598095.novalocal chronyd[800]: System clock was stepped by 1.734861 seconds
Jan 27 19:57:50 np0005598095.novalocal chronyd[800]: System clock TAI offset set to 37 seconds
Jan 27 19:57:50 np0005598095.novalocal dracut[1271]: *** Including module: fips ***
Jan 27 19:57:50 np0005598095.novalocal dracut[1271]: *** Including module: systemd-initrd ***
Jan 27 19:57:50 np0005598095.novalocal dracut[1271]: *** Including module: i18n ***
Jan 27 19:57:50 np0005598095.novalocal dracut[1271]: *** Including module: drm ***
Jan 27 19:57:51 np0005598095.novalocal dracut[1271]: *** Including module: prefixdevname ***
Jan 27 19:57:51 np0005598095.novalocal dracut[1271]: *** Including module: kernel-modules ***
Jan 27 19:57:51 np0005598095.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 27 19:57:51 np0005598095.novalocal dracut[1271]: *** Including module: kernel-modules-extra ***
Jan 27 19:57:51 np0005598095.novalocal dracut[1271]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 27 19:57:51 np0005598095.novalocal dracut[1271]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 27 19:57:51 np0005598095.novalocal dracut[1271]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 27 19:57:51 np0005598095.novalocal dracut[1271]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 27 19:57:52 np0005598095.novalocal dracut[1271]: *** Including module: qemu ***
Jan 27 19:57:52 np0005598095.novalocal dracut[1271]: *** Including module: fstab-sys ***
Jan 27 19:57:52 np0005598095.novalocal dracut[1271]: *** Including module: rootfs-block ***
Jan 27 19:57:52 np0005598095.novalocal dracut[1271]: *** Including module: terminfo ***
Jan 27 19:57:52 np0005598095.novalocal dracut[1271]: *** Including module: udev-rules ***
Jan 27 19:57:52 np0005598095.novalocal chronyd[800]: Selected source 148.113.173.205 (2.centos.pool.ntp.org)
Jan 27 19:57:52 np0005598095.novalocal dracut[1271]: Skipping udev rule: 91-permissions.rules
Jan 27 19:57:52 np0005598095.novalocal dracut[1271]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 27 19:57:52 np0005598095.novalocal dracut[1271]: *** Including module: virtiofs ***
Jan 27 19:57:52 np0005598095.novalocal dracut[1271]: *** Including module: dracut-systemd ***
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]: *** Including module: usrmount ***
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]: *** Including module: base ***
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]: *** Including module: fs-lib ***
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]: *** Including module: kdumpbase ***
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:   microcode_ctl module: mangling fw_dir
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: configuration "intel" is ignored
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 27 19:57:53 np0005598095.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]: *** Including module: openssl ***
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]: *** Including module: shutdown ***
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]: *** Including module: squash ***
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]: *** Including modules done ***
Jan 27 19:57:54 np0005598095.novalocal dracut[1271]: *** Installing kernel module dependencies ***
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: IRQ 35 affinity is now unmanaged
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: IRQ 33 affinity is now unmanaged
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: IRQ 31 affinity is now unmanaged
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: IRQ 28 affinity is now unmanaged
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: IRQ 34 affinity is now unmanaged
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: IRQ 32 affinity is now unmanaged
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: IRQ 30 affinity is now unmanaged
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 27 19:57:54 np0005598095.novalocal irqbalance[779]: IRQ 29 affinity is now unmanaged
Jan 27 19:57:55 np0005598095.novalocal dracut[1271]: *** Installing kernel module dependencies done ***
Jan 27 19:57:55 np0005598095.novalocal dracut[1271]: *** Resolving executable dependencies ***
Jan 27 19:57:55 np0005598095.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 27 19:57:57 np0005598095.novalocal dracut[1271]: *** Resolving executable dependencies done ***
Jan 27 19:57:57 np0005598095.novalocal dracut[1271]: *** Generating early-microcode cpio image ***
Jan 27 19:57:57 np0005598095.novalocal dracut[1271]: *** Store current command line parameters ***
Jan 27 19:57:57 np0005598095.novalocal dracut[1271]: Stored kernel commandline:
Jan 27 19:57:57 np0005598095.novalocal dracut[1271]: No dracut internal kernel commandline stored in the initramfs
Jan 27 19:57:57 np0005598095.novalocal dracut[1271]: *** Install squash loader ***
Jan 27 19:57:58 np0005598095.novalocal dracut[1271]: *** Squashing the files inside the initramfs ***
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: *** Squashing the files inside the initramfs done ***
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: *** Hardlinking files ***
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: Mode:           real
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: Files:          50
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: Linked:         0 files
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: Compared:       0 xattrs
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: Compared:       0 files
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: Saved:          0 B
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: Duration:       0.000455 seconds
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: *** Hardlinking files done ***
Jan 27 19:57:59 np0005598095.novalocal dracut[1271]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 27 19:58:00 np0005598095.novalocal kdumpctl[1016]: kdump: kexec: loaded kdump kernel
Jan 27 19:58:00 np0005598095.novalocal kdumpctl[1016]: kdump: Starting kdump: [OK]
Jan 27 19:58:00 np0005598095.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 27 19:58:00 np0005598095.novalocal systemd[1]: Startup finished in 1.976s (kernel) + 2.612s (initrd) + 18.179s (userspace) = 22.768s.
Jan 27 19:58:15 np0005598095.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 27 19:58:19 np0005598095.novalocal sshd-session[4306]: Invalid user solv from 80.94.92.186 port 48138
Jan 27 19:58:19 np0005598095.novalocal sshd-session[4306]: Connection closed by invalid user solv 80.94.92.186 port 48138 [preauth]
Jan 27 19:58:39 np0005598095.novalocal sshd-session[4308]: Accepted publickey for zuul from 38.102.83.114 port 41084 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 27 19:58:39 np0005598095.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 27 19:58:39 np0005598095.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 27 19:58:39 np0005598095.novalocal systemd-logind[786]: New session 1 of user zuul.
Jan 27 19:58:39 np0005598095.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 27 19:58:39 np0005598095.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Queued start job for default target Main User Target.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Created slice User Application Slice.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Started Daily Cleanup of User's Temporary Directories.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Reached target Paths.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Reached target Timers.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Starting D-Bus User Message Bus Socket...
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Starting Create User's Volatile Files and Directories...
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Finished Create User's Volatile Files and Directories.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Listening on D-Bus User Message Bus Socket.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Reached target Sockets.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Reached target Basic System.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Reached target Main User Target.
Jan 27 19:58:39 np0005598095.novalocal systemd[4312]: Startup finished in 150ms.
Jan 27 19:58:39 np0005598095.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 27 19:58:39 np0005598095.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 27 19:58:39 np0005598095.novalocal sshd-session[4308]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 19:58:40 np0005598095.novalocal python3[4394]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 19:58:45 np0005598095.novalocal python3[4422]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 19:58:51 np0005598095.novalocal python3[4480]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 19:58:52 np0005598095.novalocal python3[4520]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 27 19:58:54 np0005598095.novalocal python3[4546]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDCUgzeEFv/66GGbC691iPDTJPFXJKUxPy8c2jL91rl4EqJfZEj05PzBW/J/oeHRBACWsEEtQob3j2nI7gWgfjadcFj9oaIIUiL727lOgFHcqmugTiyxY9homS9ZViCMt2siUZmQMVAgv7rsT55njVnab9lsYteRUHykyYKi2x1QoYbT45pFzUIwjrV7Vlzire7748s/5TLFw/PhpPDqk6MKtiJGhQLAm02vBWacs4xeFJ6M/JfAnDNMaBSHwF8Mg21Bw/wR9Fl3wb5EYGSxq+mGjKzuogUc96oOx9GB/lPmcdh1YI8B4z6GyVvydpWd0jpmR2AyHL91c1migjf6YSwTnoCFthfwBz0sCC0Ya+VHhmeQb6y0r1qkbQ7kGIcoaYMy4PakgoLP/CKvIP+h/EOSFF7qiS3WUi9v/G/aJWqZUf8jeBGBwY5DFIrnyjLux+JlP5GnOL4dXrBNTGb/xvVS3wut9+hhrIoF51UMeTOJHH769hRO9BzD54YWTyZhbU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:58:54 np0005598095.novalocal python3[4570]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:58:54 np0005598095.novalocal python3[4669]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 19:58:55 np0005598095.novalocal python3[4740]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769543934.6875365-230-194891127235860/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=3619b2b3181d43caa9277725393b264f_id_rsa follow=False checksum=705ae35380669bf9d2c17e135bd947c9c6ac6ddb backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:58:55 np0005598095.novalocal python3[4863]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 19:58:56 np0005598095.novalocal python3[4934]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769543935.58784-274-110972467772098/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=3619b2b3181d43caa9277725393b264f_id_rsa.pub follow=False checksum=b0d27f7ee580fd54041de4322b69d235984eefa9 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:58:57 np0005598095.novalocal python3[4982]: ansible-ping Invoked with data=pong
Jan 27 19:58:58 np0005598095.novalocal python3[5006]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 19:59:00 np0005598095.novalocal python3[5064]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 27 19:59:02 np0005598095.novalocal python3[5096]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:02 np0005598095.novalocal python3[5120]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:02 np0005598095.novalocal python3[5144]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:03 np0005598095.novalocal python3[5168]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:03 np0005598095.novalocal python3[5192]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:03 np0005598095.novalocal python3[5216]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:06 np0005598095.novalocal sudo[5240]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbousjnpsdlczyvlxfwgotixlqjylwic ; /usr/bin/python3'
Jan 27 19:59:06 np0005598095.novalocal sudo[5240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:06 np0005598095.novalocal python3[5242]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:06 np0005598095.novalocal sudo[5240]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:06 np0005598095.novalocal sudo[5318]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyomosgcccbdaxmvsnmgrqvwbmyonjwl ; /usr/bin/python3'
Jan 27 19:59:06 np0005598095.novalocal sudo[5318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:07 np0005598095.novalocal python3[5320]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 19:59:07 np0005598095.novalocal sudo[5318]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:07 np0005598095.novalocal sudo[5391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaozstyoxmlothratafnrihtqomyuprn ; /usr/bin/python3'
Jan 27 19:59:07 np0005598095.novalocal sudo[5391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:07 np0005598095.novalocal python3[5393]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769543946.6406844-28-271782169787406/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:07 np0005598095.novalocal sudo[5391]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:08 np0005598095.novalocal python3[5441]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:08 np0005598095.novalocal python3[5465]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:08 np0005598095.novalocal python3[5489]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:09 np0005598095.novalocal python3[5513]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:09 np0005598095.novalocal python3[5537]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:09 np0005598095.novalocal python3[5561]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:09 np0005598095.novalocal python3[5585]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:10 np0005598095.novalocal python3[5609]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:10 np0005598095.novalocal python3[5633]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:10 np0005598095.novalocal python3[5657]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:10 np0005598095.novalocal python3[5681]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:11 np0005598095.novalocal python3[5705]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:11 np0005598095.novalocal python3[5729]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:11 np0005598095.novalocal python3[5753]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:11 np0005598095.novalocal python3[5777]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:12 np0005598095.novalocal python3[5801]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:12 np0005598095.novalocal python3[5825]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:12 np0005598095.novalocal python3[5849]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:13 np0005598095.novalocal python3[5873]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:13 np0005598095.novalocal python3[5897]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:13 np0005598095.novalocal python3[5921]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:13 np0005598095.novalocal python3[5945]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:14 np0005598095.novalocal python3[5969]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:14 np0005598095.novalocal python3[5993]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:14 np0005598095.novalocal python3[6017]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:15 np0005598095.novalocal python3[6041]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 19:59:17 np0005598095.novalocal sudo[6065]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmiknwigpdlrlfxkoqusscvvvjdhutnk ; /usr/bin/python3'
Jan 27 19:59:17 np0005598095.novalocal sudo[6065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:17 np0005598095.novalocal python3[6067]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 27 19:59:17 np0005598095.novalocal systemd[1]: Starting Time & Date Service...
Jan 27 19:59:17 np0005598095.novalocal systemd[1]: Started Time & Date Service.
Jan 27 19:59:17 np0005598095.novalocal systemd-timedated[6069]: Changed time zone to 'UTC' (UTC).
Jan 27 19:59:17 np0005598095.novalocal sudo[6065]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:18 np0005598095.novalocal sudo[6096]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnbjvufyqxkzpkebmravsclhosfcnyhe ; /usr/bin/python3'
Jan 27 19:59:18 np0005598095.novalocal sudo[6096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:18 np0005598095.novalocal python3[6098]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:18 np0005598095.novalocal sudo[6096]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:18 np0005598095.novalocal python3[6174]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 19:59:19 np0005598095.novalocal python3[6245]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769543958.3936327-203-27243881632161/source _original_basename=tmphimml_4g follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:19 np0005598095.novalocal python3[6345]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 19:59:20 np0005598095.novalocal python3[6416]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769543959.330275-244-153904268801695/source _original_basename=tmp16qm3qhp follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:20 np0005598095.novalocal sudo[6516]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqfgikgmcborazpefzbuqwpqozasajhk ; /usr/bin/python3'
Jan 27 19:59:20 np0005598095.novalocal sudo[6516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:20 np0005598095.novalocal python3[6518]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 19:59:20 np0005598095.novalocal sudo[6516]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:21 np0005598095.novalocal sudo[6589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubunotxiqqlrsgqjqxckzqlfdunrhknl ; /usr/bin/python3'
Jan 27 19:59:21 np0005598095.novalocal sudo[6589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:21 np0005598095.novalocal python3[6591]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769543960.4766037-307-121538303025092/source _original_basename=tmp95yx6jy_ follow=False checksum=54ceff67f46a00e80734f8bde7b737fc4d565204 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:21 np0005598095.novalocal sudo[6589]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:21 np0005598095.novalocal python3[6639]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 19:59:21 np0005598095.novalocal python3[6665]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 19:59:22 np0005598095.novalocal sudo[6743]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftlsarveksbssftjrdgbdczuaiamlqix ; /usr/bin/python3'
Jan 27 19:59:22 np0005598095.novalocal sudo[6743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:22 np0005598095.novalocal python3[6745]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 19:59:22 np0005598095.novalocal sudo[6743]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:22 np0005598095.novalocal sudo[6816]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmdjkvapukkevgtcgltnmmpysfsrqkio ; /usr/bin/python3'
Jan 27 19:59:22 np0005598095.novalocal sudo[6816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:22 np0005598095.novalocal python3[6818]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769543962.1504817-363-160174904017001/source _original_basename=tmp1bi1g68t follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:22 np0005598095.novalocal sudo[6816]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:23 np0005598095.novalocal sudo[6867]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmrbsfstcpxzdogmbgllaobdxzplhvkb ; /usr/bin/python3'
Jan 27 19:59:23 np0005598095.novalocal sudo[6867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:23 np0005598095.novalocal python3[6869]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-bc85-a0dc-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 19:59:23 np0005598095.novalocal sudo[6867]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:24 np0005598095.novalocal python3[6897]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-bc85-a0dc-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 27 19:59:25 np0005598095.novalocal python3[6925]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:45 np0005598095.novalocal sudo[6949]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjmizecukprxqskcjnnppqpclnbanwwl ; /usr/bin/python3'
Jan 27 19:59:45 np0005598095.novalocal sudo[6949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 19:59:45 np0005598095.novalocal python3[6951]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 19:59:45 np0005598095.novalocal sudo[6949]: pam_unix(sudo:session): session closed for user root
Jan 27 19:59:47 np0005598095.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 27 20:00:40 np0005598095.novalocal systemd[4312]: Starting Mark boot as successful...
Jan 27 20:00:40 np0005598095.novalocal systemd[4312]: Finished Mark boot as successful.
Jan 27 20:00:45 np0005598095.novalocal sshd-session[4321]: Received disconnect from 38.102.83.114 port 41084:11: disconnected by user
Jan 27 20:00:45 np0005598095.novalocal sshd-session[4321]: Disconnected from user zuul 38.102.83.114 port 41084
Jan 27 20:00:45 np0005598095.novalocal sshd-session[4308]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:00:45 np0005598095.novalocal systemd-logind[786]: Session 1 logged out. Waiting for processes to exit.
Jan 27 20:00:47 np0005598095.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 27 20:00:47 np0005598095.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 27 20:00:47 np0005598095.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 27 20:00:47 np0005598095.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 27 20:00:47 np0005598095.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 27 20:00:47 np0005598095.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 27 20:00:47 np0005598095.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 27 20:00:47 np0005598095.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 27 20:00:47 np0005598095.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 27 20:00:47 np0005598095.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4642] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 27 20:00:47 np0005598095.novalocal systemd-udevd[6956]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4856] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4892] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4897] device (eth1): carrier: link connected
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4900] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4911] policy: auto-activating connection 'Wired connection 1' (2719f028-bb33-30c6-adfd-93b66b18c548)
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4917] device (eth1): Activation: starting connection 'Wired connection 1' (2719f028-bb33-30c6-adfd-93b66b18c548)
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4919] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4923] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4929] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:00:47 np0005598095.novalocal NetworkManager[857]: <info>  [1769544047.4937] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 27 20:00:48 np0005598095.novalocal sshd-session[6959]: Accepted publickey for zuul from 38.102.83.114 port 54014 ssh2: RSA SHA256:KbiQ7dOB9mL82DEiBFdAeiKAgIiBJoqnrsw9aytL3+4
Jan 27 20:00:48 np0005598095.novalocal systemd-logind[786]: New session 3 of user zuul.
Jan 27 20:00:48 np0005598095.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 27 20:00:48 np0005598095.novalocal sshd-session[6959]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:00:48 np0005598095.novalocal python3[6986]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-7ddc-350b-000000000173-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:00:55 np0005598095.novalocal sudo[7064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psvvkwlcbinxdzxgurorjxdylwryzpyh ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 27 20:00:55 np0005598095.novalocal sudo[7064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:00:55 np0005598095.novalocal python3[7066]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:00:55 np0005598095.novalocal sudo[7064]: pam_unix(sudo:session): session closed for user root
Jan 27 20:00:55 np0005598095.novalocal sudo[7137]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ousltdrixkqokoweenjgwfdmmnxaqbyc ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 27 20:00:55 np0005598095.novalocal sudo[7137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:00:56 np0005598095.novalocal python3[7139]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769544055.3789744-154-267129382679321/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=8d67b3821e2524f339e065d6011df9ea055a4026 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:00:56 np0005598095.novalocal sudo[7137]: pam_unix(sudo:session): session closed for user root
Jan 27 20:00:56 np0005598095.novalocal sudo[7187]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lidewwzbuzhzqgsamxwgyvzdrmysndgu ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 27 20:00:56 np0005598095.novalocal sudo[7187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:00:56 np0005598095.novalocal python3[7189]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Stopping Network Manager...
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[857]: <info>  [1769544056.6774] caught SIGTERM, shutting down normally.
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[857]: <info>  [1769544056.6792] dhcp4 (eth0): canceled DHCP transaction
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[857]: <info>  [1769544056.6794] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[857]: <info>  [1769544056.6795] dhcp4 (eth0): state changed no lease
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[857]: <info>  [1769544056.6797] manager: NetworkManager state is now CONNECTING
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[857]: <info>  [1769544056.6900] dhcp4 (eth1): canceled DHCP transaction
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[857]: <info>  [1769544056.6901] dhcp4 (eth1): state changed no lease
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[857]: <info>  [1769544056.6954] exiting (success)
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Stopped Network Manager.
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: NetworkManager.service: Consumed 1.597s CPU time, 10.0M memory peak.
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Starting Network Manager...
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.7680] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ee988f58-1e06-463e-8261-22f688d902e1)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.7681] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.7743] manager[0x557e5fb97000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Starting Hostname Service...
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Started Hostname Service.
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8878] hostname: hostname: using hostnamed
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8879] hostname: static hostname changed from (none) to "np0005598095.novalocal"
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8886] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8893] manager[0x557e5fb97000]: rfkill: Wi-Fi hardware radio set enabled
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8893] manager[0x557e5fb97000]: rfkill: WWAN hardware radio set enabled
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8925] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8925] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8926] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8926] manager: Networking is enabled by state file
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8928] settings: Loaded settings plugin: keyfile (internal)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8932] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8960] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8971] dhcp: init: Using DHCP client 'internal'
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8974] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8979] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8985] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.8993] device (lo): Activation: starting connection 'lo' (f9304b27-0492-4654-ac3b-87bfd4814846)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9000] device (eth0): carrier: link connected
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9004] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9009] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9010] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9015] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9021] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9027] device (eth1): carrier: link connected
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9031] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9037] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (2719f028-bb33-30c6-adfd-93b66b18c548) (indicated)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9038] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9044] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9052] device (eth1): Activation: starting connection 'Wired connection 1' (2719f028-bb33-30c6-adfd-93b66b18c548)
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Started Network Manager.
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9060] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9067] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9072] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9075] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9078] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9092] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9094] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9096] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9098] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9102] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9105] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9111] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9113] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9126] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9127] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9130] device (lo): Activation: successful, device activated.
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9144] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9148] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9212] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9244] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9246] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9247] manager: NetworkManager state is now CONNECTED_SITE
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9249] device (eth0): Activation: successful, device activated.
Jan 27 20:00:56 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544056.9252] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 27 20:00:56 np0005598095.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 27 20:00:56 np0005598095.novalocal sudo[7187]: pam_unix(sudo:session): session closed for user root
Jan 27 20:00:57 np0005598095.novalocal python3[7273]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-7ddc-350b-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:01:01 np0005598095.novalocal CROND[7277]: (root) CMD (run-parts /etc/cron.hourly)
Jan 27 20:01:01 np0005598095.novalocal run-parts[7280]: (/etc/cron.hourly) starting 0anacron
Jan 27 20:01:01 np0005598095.novalocal anacron[7288]: Anacron started on 2026-01-27
Jan 27 20:01:01 np0005598095.novalocal anacron[7288]: Will run job `cron.daily' in 8 min.
Jan 27 20:01:01 np0005598095.novalocal anacron[7288]: Will run job `cron.weekly' in 28 min.
Jan 27 20:01:01 np0005598095.novalocal anacron[7288]: Will run job `cron.monthly' in 48 min.
Jan 27 20:01:01 np0005598095.novalocal anacron[7288]: Jobs will be executed sequentially
Jan 27 20:01:01 np0005598095.novalocal run-parts[7290]: (/etc/cron.hourly) finished 0anacron
Jan 27 20:01:01 np0005598095.novalocal CROND[7276]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 27 20:01:07 np0005598095.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 27 20:01:27 np0005598095.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.6817] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 27 20:01:41 np0005598095.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 27 20:01:41 np0005598095.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7134] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7139] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7149] device (eth1): Activation: successful, device activated.
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7158] manager: startup complete
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7162] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <warn>  [1769544101.7169] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7181] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 27 20:01:41 np0005598095.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7270] dhcp4 (eth1): canceled DHCP transaction
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7270] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7270] dhcp4 (eth1): state changed no lease
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7292] policy: auto-activating connection 'ci-private-network' (d9c8ffab-7346-50a5-a026-9fa25258efe7)
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7300] device (eth1): Activation: starting connection 'ci-private-network' (d9c8ffab-7346-50a5-a026-9fa25258efe7)
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7302] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7305] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7317] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7332] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7848] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7851] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:01:41 np0005598095.novalocal NetworkManager[7199]: <info>  [1769544101.7860] device (eth1): Activation: successful, device activated.
Jan 27 20:01:51 np0005598095.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 27 20:01:57 np0005598095.novalocal sshd-session[6962]: Received disconnect from 38.102.83.114 port 54014:11: disconnected by user
Jan 27 20:01:57 np0005598095.novalocal sshd-session[6962]: Disconnected from user zuul 38.102.83.114 port 54014
Jan 27 20:01:57 np0005598095.novalocal sshd-session[6959]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:01:57 np0005598095.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 27 20:01:57 np0005598095.novalocal systemd[1]: session-3.scope: Consumed 1.642s CPU time.
Jan 27 20:01:57 np0005598095.novalocal systemd-logind[786]: Session 3 logged out. Waiting for processes to exit.
Jan 27 20:01:57 np0005598095.novalocal systemd-logind[786]: Removed session 3.
Jan 27 20:02:01 np0005598095.novalocal sshd-session[7317]: Accepted publickey for zuul from 38.102.83.114 port 35226 ssh2: RSA SHA256:KbiQ7dOB9mL82DEiBFdAeiKAgIiBJoqnrsw9aytL3+4
Jan 27 20:02:01 np0005598095.novalocal systemd-logind[786]: New session 4 of user zuul.
Jan 27 20:02:01 np0005598095.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 27 20:02:01 np0005598095.novalocal sshd-session[7317]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:02:01 np0005598095.novalocal sudo[7396]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emylzcscnnqurjefhqhnrtebrbemtmgl ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 27 20:02:01 np0005598095.novalocal sudo[7396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:02:01 np0005598095.novalocal python3[7398]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:02:01 np0005598095.novalocal sudo[7396]: pam_unix(sudo:session): session closed for user root
Jan 27 20:02:01 np0005598095.novalocal sudo[7469]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-livbstvnjxvcuiysogvuerbsqlrofixd ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 27 20:02:01 np0005598095.novalocal sudo[7469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:02:01 np0005598095.novalocal python3[7471]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769544121.3203042-312-17226387771174/source _original_basename=tmpwkq_28i7 follow=False checksum=5050ae8f82f406a6bda73d2a009b70bc00a3513c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:02:01 np0005598095.novalocal sudo[7469]: pam_unix(sudo:session): session closed for user root
Jan 27 20:02:04 np0005598095.novalocal sshd-session[7320]: Connection closed by 38.102.83.114 port 35226
Jan 27 20:02:04 np0005598095.novalocal sshd-session[7317]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:02:04 np0005598095.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 27 20:02:04 np0005598095.novalocal systemd-logind[786]: Session 4 logged out. Waiting for processes to exit.
Jan 27 20:02:04 np0005598095.novalocal systemd-logind[786]: Removed session 4.
Jan 27 20:02:46 np0005598095.novalocal sshd-session[7496]: Invalid user debian from 62.106.95.229 port 56858
Jan 27 20:02:46 np0005598095.novalocal sshd-session[7496]: Connection closed by invalid user debian 62.106.95.229 port 56858 [preauth]
Jan 27 20:02:52 np0005598095.novalocal sshd-session[7498]: Invalid user debian from 62.106.95.229 port 56864
Jan 27 20:02:52 np0005598095.novalocal sshd-session[7498]: Connection closed by invalid user debian 62.106.95.229 port 56864 [preauth]
Jan 27 20:02:55 np0005598095.novalocal sshd-session[7500]: Invalid user debian from 62.106.95.229 port 56872
Jan 27 20:02:56 np0005598095.novalocal sshd-session[7500]: Connection closed by invalid user debian 62.106.95.229 port 56872 [preauth]
Jan 27 20:03:01 np0005598095.novalocal sshd-session[7502]: Invalid user debian from 62.106.95.229 port 56878
Jan 27 20:03:02 np0005598095.novalocal sshd-session[7502]: Connection closed by invalid user debian 62.106.95.229 port 56878 [preauth]
Jan 27 20:03:04 np0005598095.novalocal sshd-session[7504]: Invalid user debian from 62.106.95.229 port 56880
Jan 27 20:03:05 np0005598095.novalocal sshd-session[7504]: Connection closed by invalid user debian 62.106.95.229 port 56880 [preauth]
Jan 27 20:03:06 np0005598095.novalocal sshd-session[7506]: Invalid user solv from 80.94.92.186 port 51158
Jan 27 20:03:06 np0005598095.novalocal sshd-session[7506]: Connection closed by invalid user solv 80.94.92.186 port 51158 [preauth]
Jan 27 20:03:09 np0005598095.novalocal sshd-session[7508]: Invalid user debian from 62.106.95.229 port 56886
Jan 27 20:03:09 np0005598095.novalocal sshd-session[7508]: Connection closed by invalid user debian 62.106.95.229 port 56886 [preauth]
Jan 27 20:03:13 np0005598095.novalocal sshd-session[7510]: Invalid user debian from 62.106.95.229 port 56892
Jan 27 20:03:13 np0005598095.novalocal sshd-session[7510]: Connection closed by invalid user debian 62.106.95.229 port 56892 [preauth]
Jan 27 20:03:17 np0005598095.novalocal sshd-session[7512]: Invalid user debian from 62.106.95.229 port 56896
Jan 27 20:03:17 np0005598095.novalocal sshd-session[7512]: Connection closed by invalid user debian 62.106.95.229 port 56896 [preauth]
Jan 27 20:03:19 np0005598095.novalocal sshd-session[7514]: Invalid user debian from 62.106.95.229 port 56902
Jan 27 20:03:19 np0005598095.novalocal sshd-session[7514]: Connection closed by invalid user debian 62.106.95.229 port 56902 [preauth]
Jan 27 20:03:23 np0005598095.novalocal sshd-session[7516]: Invalid user debian from 62.106.95.229 port 56908
Jan 27 20:03:24 np0005598095.novalocal sshd-session[7516]: Connection closed by invalid user debian 62.106.95.229 port 56908 [preauth]
Jan 27 20:03:27 np0005598095.novalocal sshd-session[7518]: Invalid user debian from 62.106.95.229 port 56910
Jan 27 20:03:27 np0005598095.novalocal sshd-session[7518]: Connection closed by invalid user debian 62.106.95.229 port 56910 [preauth]
Jan 27 20:03:32 np0005598095.novalocal sshd-session[7520]: Invalid user debian from 62.106.95.229 port 56916
Jan 27 20:03:32 np0005598095.novalocal sshd-session[7520]: Connection closed by invalid user debian 62.106.95.229 port 56916 [preauth]
Jan 27 20:03:35 np0005598095.novalocal sshd-session[7522]: Invalid user debian from 62.106.95.229 port 56922
Jan 27 20:03:35 np0005598095.novalocal sshd-session[7522]: Connection closed by invalid user debian 62.106.95.229 port 56922 [preauth]
Jan 27 20:03:37 np0005598095.novalocal sshd-session[7524]: Invalid user debian from 62.106.95.229 port 56926
Jan 27 20:03:37 np0005598095.novalocal sshd-session[7524]: Connection closed by invalid user debian 62.106.95.229 port 56926 [preauth]
Jan 27 20:03:40 np0005598095.novalocal sshd-session[7526]: Invalid user debian from 62.106.95.229 port 56930
Jan 27 20:03:40 np0005598095.novalocal systemd[4312]: Created slice User Background Tasks Slice.
Jan 27 20:03:40 np0005598095.novalocal systemd[4312]: Starting Cleanup of User's Temporary Files and Directories...
Jan 27 20:03:40 np0005598095.novalocal systemd[4312]: Finished Cleanup of User's Temporary Files and Directories.
Jan 27 20:03:40 np0005598095.novalocal sshd-session[7526]: Connection closed by invalid user debian 62.106.95.229 port 56930 [preauth]
Jan 27 20:03:43 np0005598095.novalocal sshd-session[7531]: Invalid user debian from 62.106.95.229 port 56936
Jan 27 20:03:43 np0005598095.novalocal sshd-session[7531]: Connection closed by invalid user debian 62.106.95.229 port 56936 [preauth]
Jan 27 20:03:44 np0005598095.novalocal sshd-session[7533]: Invalid user debian from 62.106.95.229 port 56938
Jan 27 20:03:45 np0005598095.novalocal sshd-session[7533]: Connection closed by invalid user debian 62.106.95.229 port 56938 [preauth]
Jan 27 20:03:47 np0005598095.novalocal sshd-session[7535]: Invalid user debian from 62.106.95.229 port 56940
Jan 27 20:03:49 np0005598095.novalocal sshd-session[7535]: Connection closed by invalid user debian 62.106.95.229 port 56940 [preauth]
Jan 27 20:03:54 np0005598095.novalocal sshd-session[7537]: Invalid user debian from 62.106.95.229 port 56946
Jan 27 20:03:54 np0005598095.novalocal sshd-session[7537]: Connection closed by invalid user debian 62.106.95.229 port 56946 [preauth]
Jan 27 20:03:57 np0005598095.novalocal sshd-session[7539]: Invalid user debian from 62.106.95.229 port 56954
Jan 27 20:03:58 np0005598095.novalocal sshd-session[7539]: Connection closed by invalid user debian 62.106.95.229 port 56954 [preauth]
Jan 27 20:04:00 np0005598095.novalocal sshd-session[7541]: Invalid user debian from 62.106.95.229 port 56962
Jan 27 20:04:01 np0005598095.novalocal sshd-session[7541]: Connection closed by invalid user debian 62.106.95.229 port 56962 [preauth]
Jan 27 20:04:03 np0005598095.novalocal sshd-session[7543]: Invalid user debian from 62.106.95.229 port 56966
Jan 27 20:04:04 np0005598095.novalocal sshd-session[7543]: Connection closed by invalid user debian 62.106.95.229 port 56966 [preauth]
Jan 27 20:04:06 np0005598095.novalocal sshd-session[7545]: Invalid user debian from 62.106.95.229 port 56972
Jan 27 20:04:06 np0005598095.novalocal sshd-session[7545]: Connection closed by invalid user debian 62.106.95.229 port 56972 [preauth]
Jan 27 20:04:08 np0005598095.novalocal sshd-session[7547]: Invalid user debian from 62.106.95.229 port 56974
Jan 27 20:04:08 np0005598095.novalocal sshd-session[7547]: Connection closed by invalid user debian 62.106.95.229 port 56974 [preauth]
Jan 27 20:04:10 np0005598095.novalocal sshd-session[7549]: Invalid user debian from 62.106.95.229 port 56976
Jan 27 20:04:10 np0005598095.novalocal sshd-session[7549]: Connection closed by invalid user debian 62.106.95.229 port 56976 [preauth]
Jan 27 20:04:13 np0005598095.novalocal sshd-session[7551]: Invalid user debian from 62.106.95.229 port 56982
Jan 27 20:04:14 np0005598095.novalocal sshd-session[7551]: Connection closed by invalid user debian 62.106.95.229 port 56982 [preauth]
Jan 27 20:04:16 np0005598095.novalocal sshd-session[7553]: Invalid user debian from 62.106.95.229 port 56988
Jan 27 20:04:16 np0005598095.novalocal sshd-session[7553]: Connection closed by invalid user debian 62.106.95.229 port 56988 [preauth]
Jan 27 20:04:22 np0005598095.novalocal sshd-session[7555]: Invalid user debian from 62.106.95.229 port 56990
Jan 27 20:04:23 np0005598095.novalocal sshd-session[7555]: Connection closed by invalid user debian 62.106.95.229 port 56990 [preauth]
Jan 27 20:04:27 np0005598095.novalocal sshd-session[7557]: Invalid user debian from 62.106.95.229 port 57000
Jan 27 20:04:27 np0005598095.novalocal sshd-session[7557]: Connection closed by invalid user debian 62.106.95.229 port 57000 [preauth]
Jan 27 20:04:28 np0005598095.novalocal sshd-session[7559]: Invalid user debian from 62.106.95.229 port 57010
Jan 27 20:04:28 np0005598095.novalocal sshd-session[7559]: Connection closed by invalid user debian 62.106.95.229 port 57010 [preauth]
Jan 27 20:04:32 np0005598095.novalocal sshd-session[7562]: Invalid user debian from 62.106.95.229 port 57012
Jan 27 20:04:33 np0005598095.novalocal sshd-session[7562]: Connection closed by invalid user debian 62.106.95.229 port 57012 [preauth]
Jan 27 20:04:33 np0005598095.novalocal sshd-session[7564]: Invalid user debian from 62.106.95.229 port 57020
Jan 27 20:04:34 np0005598095.novalocal sshd-session[7564]: Connection closed by invalid user debian 62.106.95.229 port 57020 [preauth]
Jan 27 20:04:37 np0005598095.novalocal sshd-session[7566]: Invalid user debian from 62.106.95.229 port 57022
Jan 27 20:04:37 np0005598095.novalocal sshd-session[7566]: Connection closed by invalid user debian 62.106.95.229 port 57022 [preauth]
Jan 27 20:04:43 np0005598095.novalocal sshd-session[7568]: Invalid user admin from 62.106.95.229 port 57026
Jan 27 20:04:43 np0005598095.novalocal sshd-session[7568]: Connection closed by invalid user admin 62.106.95.229 port 57026 [preauth]
Jan 27 20:04:45 np0005598095.novalocal sshd-session[7570]: Invalid user admin from 62.106.95.229 port 57028
Jan 27 20:04:46 np0005598095.novalocal sshd-session[7570]: Connection closed by invalid user admin 62.106.95.229 port 57028 [preauth]
Jan 27 20:04:47 np0005598095.novalocal sshd-session[7572]: Invalid user admin from 62.106.95.229 port 57032
Jan 27 20:04:47 np0005598095.novalocal sshd-session[7572]: Connection closed by invalid user admin 62.106.95.229 port 57032 [preauth]
Jan 27 20:04:49 np0005598095.novalocal sshd-session[7574]: Invalid user admin from 62.106.95.229 port 57036
Jan 27 20:04:49 np0005598095.novalocal sshd-session[7574]: Connection closed by invalid user admin 62.106.95.229 port 57036 [preauth]
Jan 27 20:04:52 np0005598095.novalocal sshd-session[7576]: Invalid user admin from 62.106.95.229 port 57038
Jan 27 20:04:53 np0005598095.novalocal sshd-session[7576]: Connection closed by invalid user admin 62.106.95.229 port 57038 [preauth]
Jan 27 20:04:55 np0005598095.novalocal sshd-session[7578]: Invalid user admin from 62.106.95.229 port 57044
Jan 27 20:04:55 np0005598095.novalocal sshd-session[7578]: Connection closed by invalid user admin 62.106.95.229 port 57044 [preauth]
Jan 27 20:04:58 np0005598095.novalocal sshd-session[7580]: Invalid user admin from 62.106.95.229 port 57050
Jan 27 20:04:58 np0005598095.novalocal sshd-session[7580]: Connection closed by invalid user admin 62.106.95.229 port 57050 [preauth]
Jan 27 20:05:02 np0005598095.novalocal sshd-session[7582]: Invalid user admin from 62.106.95.229 port 57052
Jan 27 20:05:03 np0005598095.novalocal sshd-session[7582]: Connection closed by invalid user admin 62.106.95.229 port 57052 [preauth]
Jan 27 20:05:05 np0005598095.novalocal sshd-session[7584]: Invalid user admin from 62.106.95.229 port 57054
Jan 27 20:05:05 np0005598095.novalocal sshd-session[7584]: Connection closed by invalid user admin 62.106.95.229 port 57054 [preauth]
Jan 27 20:05:08 np0005598095.novalocal sshd-session[7586]: Invalid user admin from 62.106.95.229 port 57058
Jan 27 20:05:09 np0005598095.novalocal sshd-session[7586]: Connection closed by invalid user admin 62.106.95.229 port 57058 [preauth]
Jan 27 20:05:10 np0005598095.novalocal sshd-session[7588]: Invalid user admin from 62.106.95.229 port 57062
Jan 27 20:05:10 np0005598095.novalocal sshd-session[7588]: Connection closed by invalid user admin 62.106.95.229 port 57062 [preauth]
Jan 27 20:05:19 np0005598095.novalocal sshd-session[7590]: Invalid user admin from 62.106.95.229 port 57066
Jan 27 20:05:19 np0005598095.novalocal sshd-session[7590]: Connection closed by invalid user admin 62.106.95.229 port 57066 [preauth]
Jan 27 20:05:22 np0005598095.novalocal sshd-session[7592]: Invalid user admin from 62.106.95.229 port 57072
Jan 27 20:05:22 np0005598095.novalocal sshd-session[7592]: Connection closed by invalid user admin 62.106.95.229 port 57072 [preauth]
Jan 27 20:05:26 np0005598095.novalocal sshd-session[7594]: Invalid user admin from 62.106.95.229 port 57078
Jan 27 20:05:26 np0005598095.novalocal sshd-session[7594]: Connection closed by invalid user admin 62.106.95.229 port 57078 [preauth]
Jan 27 20:05:31 np0005598095.novalocal sshd-session[7596]: Invalid user admin from 62.106.95.229 port 57084
Jan 27 20:05:31 np0005598095.novalocal sshd-session[7596]: Connection closed by invalid user admin 62.106.95.229 port 57084 [preauth]
Jan 27 20:05:33 np0005598095.novalocal sshd-session[7599]: Invalid user admin from 62.106.95.229 port 57086
Jan 27 20:05:33 np0005598095.novalocal sshd-session[7599]: Connection closed by invalid user admin 62.106.95.229 port 57086 [preauth]
Jan 27 20:05:35 np0005598095.novalocal sshd-session[7601]: Invalid user admin from 62.106.95.229 port 57088
Jan 27 20:05:36 np0005598095.novalocal sshd-session[7601]: Connection closed by invalid user admin 62.106.95.229 port 57088 [preauth]
Jan 27 20:05:39 np0005598095.novalocal sshd-session[7603]: Invalid user admin from 62.106.95.229 port 57092
Jan 27 20:05:39 np0005598095.novalocal sshd-session[7603]: Connection closed by invalid user admin 62.106.95.229 port 57092 [preauth]
Jan 27 20:05:42 np0005598095.novalocal sshd-session[7605]: Invalid user admin from 62.106.95.229 port 57094
Jan 27 20:05:42 np0005598095.novalocal sshd-session[7605]: Connection closed by invalid user admin 62.106.95.229 port 57094 [preauth]
Jan 27 20:05:43 np0005598095.novalocal sshd-session[7607]: Invalid user admin from 62.106.95.229 port 57102
Jan 27 20:05:44 np0005598095.novalocal sshd-session[7607]: Connection closed by invalid user admin 62.106.95.229 port 57102 [preauth]
Jan 27 20:05:45 np0005598095.novalocal sshd-session[7609]: Invalid user admin from 62.106.95.229 port 57106
Jan 27 20:05:45 np0005598095.novalocal sshd-session[7609]: Connection closed by invalid user admin 62.106.95.229 port 57106 [preauth]
Jan 27 20:05:48 np0005598095.novalocal sshd-session[7611]: Invalid user admin from 62.106.95.229 port 57108
Jan 27 20:05:48 np0005598095.novalocal sshd-session[7611]: Connection closed by invalid user admin 62.106.95.229 port 57108 [preauth]
Jan 27 20:05:55 np0005598095.novalocal sshd-session[7613]: Invalid user admin from 62.106.95.229 port 57114
Jan 27 20:05:56 np0005598095.novalocal sshd-session[7613]: Connection closed by invalid user admin 62.106.95.229 port 57114 [preauth]
Jan 27 20:05:58 np0005598095.novalocal sshd-session[7615]: Invalid user admin from 62.106.95.229 port 57126
Jan 27 20:05:58 np0005598095.novalocal sshd-session[7615]: Connection closed by invalid user admin 62.106.95.229 port 57126 [preauth]
Jan 27 20:06:00 np0005598095.novalocal sshd-session[7617]: Invalid user admin from 62.106.95.229 port 57130
Jan 27 20:06:01 np0005598095.novalocal sshd-session[7617]: Connection closed by invalid user admin 62.106.95.229 port 57130 [preauth]
Jan 27 20:06:03 np0005598095.novalocal sshd-session[7619]: Invalid user admin from 62.106.95.229 port 57136
Jan 27 20:06:03 np0005598095.novalocal sshd-session[7619]: Connection closed by invalid user admin 62.106.95.229 port 57136 [preauth]
Jan 27 20:06:05 np0005598095.novalocal sshd-session[7621]: Invalid user admin from 62.106.95.229 port 57138
Jan 27 20:06:05 np0005598095.novalocal sshd-session[7621]: Connection closed by invalid user admin 62.106.95.229 port 57138 [preauth]
Jan 27 20:06:07 np0005598095.novalocal sshd-session[7623]: Invalid user admin from 62.106.95.229 port 57144
Jan 27 20:06:07 np0005598095.novalocal sshd-session[7623]: Connection closed by invalid user admin 62.106.95.229 port 57144 [preauth]
Jan 27 20:06:08 np0005598095.novalocal sshd-session[7625]: Invalid user admin from 62.106.95.229 port 57146
Jan 27 20:06:08 np0005598095.novalocal sshd-session[7625]: Connection closed by invalid user admin 62.106.95.229 port 57146 [preauth]
Jan 27 20:06:09 np0005598095.novalocal sshd-session[7627]: Invalid user admin from 62.106.95.229 port 57148
Jan 27 20:06:09 np0005598095.novalocal sshd-session[7627]: Connection closed by invalid user admin 62.106.95.229 port 57148 [preauth]
Jan 27 20:06:12 np0005598095.novalocal sshd-session[7629]: Invalid user admin from 62.106.95.229 port 57150
Jan 27 20:06:12 np0005598095.novalocal sshd-session[7629]: Connection closed by invalid user admin 62.106.95.229 port 57150 [preauth]
Jan 27 20:06:14 np0005598095.novalocal sshd-session[7631]: Invalid user admin from 62.106.95.229 port 57156
Jan 27 20:06:14 np0005598095.novalocal sshd-session[7631]: Connection closed by invalid user admin 62.106.95.229 port 57156 [preauth]
Jan 27 20:06:19 np0005598095.novalocal sshd-session[7633]: Invalid user admin from 62.106.95.229 port 57164
Jan 27 20:06:19 np0005598095.novalocal sshd-session[7633]: Connection closed by invalid user admin 62.106.95.229 port 57164 [preauth]
Jan 27 20:06:23 np0005598095.novalocal sshd-session[7635]: Invalid user admin from 62.106.95.229 port 57168
Jan 27 20:06:24 np0005598095.novalocal sshd-session[7635]: Connection closed by invalid user admin 62.106.95.229 port 57168 [preauth]
Jan 27 20:06:24 np0005598095.novalocal sshd-session[7637]: Invalid user admin from 62.106.95.229 port 57174
Jan 27 20:06:25 np0005598095.novalocal sshd-session[7637]: Connection closed by invalid user admin 62.106.95.229 port 57174 [preauth]
Jan 27 20:06:26 np0005598095.novalocal sshd-session[7639]: Invalid user admin from 62.106.95.229 port 57178
Jan 27 20:06:26 np0005598095.novalocal sshd-session[7639]: Connection closed by invalid user admin 62.106.95.229 port 57178 [preauth]
Jan 27 20:06:29 np0005598095.novalocal sshd-session[7641]: Invalid user admin from 62.106.95.229 port 57180
Jan 27 20:06:29 np0005598095.novalocal sshd-session[7641]: Connection closed by invalid user admin 62.106.95.229 port 57180 [preauth]
Jan 27 20:06:30 np0005598095.novalocal sshd-session[7643]: Invalid user admin from 62.106.95.229 port 57186
Jan 27 20:06:31 np0005598095.novalocal sshd-session[7643]: Connection closed by invalid user admin 62.106.95.229 port 57186 [preauth]
Jan 27 20:06:32 np0005598095.novalocal sshd-session[7645]: Invalid user admin from 62.106.95.229 port 57188
Jan 27 20:06:32 np0005598095.novalocal sshd-session[7645]: Connection closed by invalid user admin 62.106.95.229 port 57188 [preauth]
Jan 27 20:06:33 np0005598095.novalocal sshd-session[7647]: Invalid user admin from 62.106.95.229 port 57190
Jan 27 20:06:33 np0005598095.novalocal sshd-session[7647]: Connection closed by invalid user admin 62.106.95.229 port 57190 [preauth]
Jan 27 20:06:35 np0005598095.novalocal sshd-session[7649]: Invalid user admin from 62.106.95.229 port 57192
Jan 27 20:06:36 np0005598095.novalocal sshd-session[7649]: Connection closed by invalid user admin 62.106.95.229 port 57192 [preauth]
Jan 27 20:06:46 np0005598095.novalocal sshd-session[7651]: Invalid user admin from 62.106.95.229 port 57194
Jan 27 20:06:47 np0005598095.novalocal sshd-session[7651]: Connection closed by invalid user admin 62.106.95.229 port 57194 [preauth]
Jan 27 20:06:49 np0005598095.novalocal sshd-session[7653]: Invalid user admin from 62.106.95.229 port 57208
Jan 27 20:06:49 np0005598095.novalocal sshd-session[7653]: Connection closed by invalid user admin 62.106.95.229 port 57208 [preauth]
Jan 27 20:06:51 np0005598095.novalocal sshd-session[7655]: Invalid user admin from 62.106.95.229 port 57210
Jan 27 20:06:51 np0005598095.novalocal sshd-session[7655]: Connection closed by invalid user admin 62.106.95.229 port 57210 [preauth]
Jan 27 20:06:54 np0005598095.novalocal sshd-session[7657]: Invalid user admin from 62.106.95.229 port 57212
Jan 27 20:06:54 np0005598095.novalocal sshd-session[7657]: Connection closed by invalid user admin 62.106.95.229 port 57212 [preauth]
Jan 27 20:06:56 np0005598095.novalocal sshd-session[7659]: Invalid user admin from 62.106.95.229 port 57220
Jan 27 20:06:57 np0005598095.novalocal sshd-session[7659]: Connection closed by invalid user admin 62.106.95.229 port 57220 [preauth]
Jan 27 20:06:58 np0005598095.novalocal sshd-session[7661]: Invalid user admin from 62.106.95.229 port 57224
Jan 27 20:06:58 np0005598095.novalocal sshd-session[7661]: Connection closed by invalid user admin 62.106.95.229 port 57224 [preauth]
Jan 27 20:07:03 np0005598095.novalocal sshd-session[7663]: Invalid user admin from 62.106.95.229 port 57226
Jan 27 20:07:03 np0005598095.novalocal sshd-session[7663]: Connection closed by invalid user admin 62.106.95.229 port 57226 [preauth]
Jan 27 20:07:06 np0005598095.novalocal sshd-session[7665]: Invalid user admin from 62.106.95.229 port 57234
Jan 27 20:07:06 np0005598095.novalocal sshd-session[7665]: Connection closed by invalid user admin 62.106.95.229 port 57234 [preauth]
Jan 27 20:07:07 np0005598095.novalocal sshd-session[7667]: Invalid user admin from 62.106.95.229 port 57240
Jan 27 20:07:07 np0005598095.novalocal sshd-session[7667]: Connection closed by invalid user admin 62.106.95.229 port 57240 [preauth]
Jan 27 20:07:09 np0005598095.novalocal sshd-session[7669]: Invalid user admin from 62.106.95.229 port 57242
Jan 27 20:07:10 np0005598095.novalocal sshd-session[7669]: Connection closed by invalid user admin 62.106.95.229 port 57242 [preauth]
Jan 27 20:07:13 np0005598095.novalocal sshd-session[7671]: Invalid user admin from 62.106.95.229 port 57248
Jan 27 20:07:13 np0005598095.novalocal sshd-session[7671]: Connection closed by invalid user admin 62.106.95.229 port 57248 [preauth]
Jan 27 20:07:14 np0005598095.novalocal sshd-session[7673]: Invalid user admin from 62.106.95.229 port 57252
Jan 27 20:07:15 np0005598095.novalocal sshd-session[7673]: Connection closed by invalid user admin 62.106.95.229 port 57252 [preauth]
Jan 27 20:07:18 np0005598095.novalocal sshd-session[7675]: Invalid user admin from 62.106.95.229 port 57256
Jan 27 20:07:19 np0005598095.novalocal sshd-session[7675]: Connection closed by invalid user admin 62.106.95.229 port 57256 [preauth]
Jan 27 20:07:40 np0005598095.novalocal sshd-session[7677]: Invalid user admin from 62.106.95.229 port 57264
Jan 27 20:07:40 np0005598095.novalocal sshd-session[7677]: Connection closed by invalid user admin 62.106.95.229 port 57264 [preauth]
Jan 27 20:07:42 np0005598095.novalocal sshd-session[7679]: Invalid user admin from 62.106.95.229 port 57288
Jan 27 20:07:42 np0005598095.novalocal sshd-session[7679]: Connection closed by invalid user admin 62.106.95.229 port 57288 [preauth]
Jan 27 20:07:42 np0005598095.novalocal sshd-session[7681]: Invalid user ubuntu from 80.94.92.186 port 54202
Jan 27 20:07:42 np0005598095.novalocal sshd-session[7681]: Connection closed by invalid user ubuntu 80.94.92.186 port 54202 [preauth]
Jan 27 20:07:45 np0005598095.novalocal sshd-session[7683]: Invalid user admin from 62.106.95.229 port 57290
Jan 27 20:07:46 np0005598095.novalocal sshd-session[7683]: Connection closed by invalid user admin 62.106.95.229 port 57290 [preauth]
Jan 27 20:07:49 np0005598095.novalocal systemd[1]: Starting dnf makecache...
Jan 27 20:07:49 np0005598095.novalocal sshd-session[7685]: Invalid user admin from 62.106.95.229 port 57294
Jan 27 20:07:49 np0005598095.novalocal dnf[7687]: Failed determining last makecache time.
Jan 27 20:07:50 np0005598095.novalocal dnf[7687]: CentOS Stream 9 - BaseOS                         24 kB/s | 6.7 kB     00:00
Jan 27 20:07:50 np0005598095.novalocal dnf[7687]: CentOS Stream 9 - AppStream                      62 kB/s | 6.8 kB     00:00
Jan 27 20:07:50 np0005598095.novalocal sshd-session[7685]: Connection closed by invalid user admin 62.106.95.229 port 57294 [preauth]
Jan 27 20:07:50 np0005598095.novalocal dnf[7687]: CentOS Stream 9 - CRB                            56 kB/s | 6.6 kB     00:00
Jan 27 20:07:51 np0005598095.novalocal dnf[7687]: CentOS Stream 9 - Extras packages                59 kB/s | 7.3 kB     00:00
Jan 27 20:07:51 np0005598095.novalocal dnf[7687]: Metadata cache created.
Jan 27 20:07:51 np0005598095.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 27 20:07:51 np0005598095.novalocal systemd[1]: Finished dnf makecache.
Jan 27 20:07:55 np0005598095.novalocal sshd-session[7694]: Invalid user admin from 62.106.95.229 port 57302
Jan 27 20:07:56 np0005598095.novalocal sshd-session[7694]: Connection closed by invalid user admin 62.106.95.229 port 57302 [preauth]
Jan 27 20:07:57 np0005598095.novalocal sshd-session[7698]: Invalid user admin from 62.106.95.229 port 57308
Jan 27 20:07:57 np0005598095.novalocal sshd-session[7698]: Connection closed by invalid user admin 62.106.95.229 port 57308 [preauth]
Jan 27 20:08:00 np0005598095.novalocal sshd-session[7700]: Invalid user admin from 62.106.95.229 port 57310
Jan 27 20:08:01 np0005598095.novalocal sshd-session[7700]: Connection closed by invalid user admin 62.106.95.229 port 57310 [preauth]
Jan 27 20:08:03 np0005598095.novalocal sshd-session[7702]: Invalid user admin from 62.106.95.229 port 57320
Jan 27 20:08:04 np0005598095.novalocal sshd-session[7702]: Connection closed by invalid user admin 62.106.95.229 port 57320 [preauth]
Jan 27 20:08:07 np0005598095.novalocal sshd-session[7704]: Invalid user admin from 62.106.95.229 port 57328
Jan 27 20:08:07 np0005598095.novalocal sshd-session[7704]: Connection closed by invalid user admin 62.106.95.229 port 57328 [preauth]
Jan 27 20:08:10 np0005598095.novalocal sshd-session[7706]: Invalid user admin from 62.106.95.229 port 57350
Jan 27 20:08:10 np0005598095.novalocal sshd-session[7706]: Connection closed by invalid user admin 62.106.95.229 port 57350 [preauth]
Jan 27 20:08:11 np0005598095.novalocal sshd-session[7708]: Invalid user admin from 62.106.95.229 port 57372
Jan 27 20:08:13 np0005598095.novalocal sshd-session[7708]: Connection closed by invalid user admin 62.106.95.229 port 57372 [preauth]
Jan 27 20:08:13 np0005598095.novalocal sshd-session[7710]: Invalid user admin from 62.106.95.229 port 57392
Jan 27 20:08:14 np0005598095.novalocal sshd-session[7710]: Connection closed by invalid user admin 62.106.95.229 port 57392 [preauth]
Jan 27 20:08:16 np0005598095.novalocal sshd-session[7712]: Invalid user admin from 62.106.95.229 port 57398
Jan 27 20:08:16 np0005598095.novalocal sshd-session[7712]: Connection closed by invalid user admin 62.106.95.229 port 57398 [preauth]
Jan 27 20:08:18 np0005598095.novalocal sshd-session[7714]: Invalid user admin from 62.106.95.229 port 57410
Jan 27 20:08:19 np0005598095.novalocal sshd-session[7714]: Connection closed by invalid user admin 62.106.95.229 port 57410 [preauth]
Jan 27 20:08:23 np0005598095.novalocal sshd-session[7716]: Invalid user admin from 62.106.95.229 port 57426
Jan 27 20:08:23 np0005598095.novalocal sshd-session[7716]: Connection closed by invalid user admin 62.106.95.229 port 57426 [preauth]
Jan 27 20:08:25 np0005598095.novalocal sshd-session[7718]: Invalid user admin from 62.106.95.229 port 57446
Jan 27 20:08:25 np0005598095.novalocal sshd-session[7718]: Connection closed by invalid user admin 62.106.95.229 port 57446 [preauth]
Jan 27 20:08:29 np0005598095.novalocal sshd-session[7720]: Invalid user admin from 62.106.95.229 port 57460
Jan 27 20:08:31 np0005598095.novalocal sshd-session[7720]: Connection closed by invalid user admin 62.106.95.229 port 57460 [preauth]
Jan 27 20:08:32 np0005598095.novalocal sshd-session[7722]: Invalid user admin from 62.106.95.229 port 57476
Jan 27 20:08:32 np0005598095.novalocal sshd-session[7722]: Connection closed by invalid user admin 62.106.95.229 port 57476 [preauth]
Jan 27 20:08:33 np0005598095.novalocal sshd-session[7724]: Invalid user admin from 62.106.95.229 port 57478
Jan 27 20:08:34 np0005598095.novalocal sshd-session[7724]: Connection closed by invalid user admin 62.106.95.229 port 57478 [preauth]
Jan 27 20:08:38 np0005598095.novalocal sshd-session[7726]: Invalid user admin from 62.106.95.229 port 57480
Jan 27 20:08:39 np0005598095.novalocal sshd-session[7726]: Connection closed by invalid user admin 62.106.95.229 port 57480 [preauth]
Jan 27 20:08:46 np0005598095.novalocal sshd-session[7728]: Invalid user admin from 62.106.95.229 port 57514
Jan 27 20:08:46 np0005598095.novalocal sshd-session[7728]: Connection closed by invalid user admin 62.106.95.229 port 57514 [preauth]
Jan 27 20:08:51 np0005598095.novalocal sshd-session[7730]: Invalid user admin from 62.106.95.229 port 57542
Jan 27 20:08:52 np0005598095.novalocal sshd-session[7730]: Connection closed by invalid user admin 62.106.95.229 port 57542 [preauth]
Jan 27 20:08:55 np0005598095.novalocal sshd-session[7732]: Invalid user admin from 62.106.95.229 port 57562
Jan 27 20:08:55 np0005598095.novalocal sshd-session[7732]: Connection closed by invalid user admin 62.106.95.229 port 57562 [preauth]
Jan 27 20:09:01 np0005598095.novalocal anacron[7288]: Job `cron.daily' started
Jan 27 20:09:01 np0005598095.novalocal anacron[7288]: Job `cron.daily' terminated
Jan 27 20:09:01 np0005598095.novalocal sshd-session[7734]: Invalid user admin from 62.106.95.229 port 57566
Jan 27 20:09:02 np0005598095.novalocal sshd-session[7734]: Connection closed by invalid user admin 62.106.95.229 port 57566 [preauth]
Jan 27 20:09:03 np0005598095.novalocal sshd-session[7738]: Invalid user admin from 62.106.95.229 port 57578
Jan 27 20:09:05 np0005598095.novalocal sshd-session[7738]: Connection closed by invalid user admin 62.106.95.229 port 57578 [preauth]
Jan 27 20:09:06 np0005598095.novalocal sshd-session[7740]: Invalid user admin from 62.106.95.229 port 57588
Jan 27 20:09:07 np0005598095.novalocal sshd-session[7740]: Connection closed by invalid user admin 62.106.95.229 port 57588 [preauth]
Jan 27 20:09:10 np0005598095.novalocal sshd-session[7742]: Invalid user admin from 62.106.95.229 port 57592
Jan 27 20:09:10 np0005598095.novalocal sshd-session[7742]: Connection closed by invalid user admin 62.106.95.229 port 57592 [preauth]
Jan 27 20:09:12 np0005598095.novalocal sshd-session[7744]: Invalid user admin from 62.106.95.229 port 57598
Jan 27 20:09:12 np0005598095.novalocal sshd-session[7744]: Connection closed by invalid user admin 62.106.95.229 port 57598 [preauth]
Jan 27 20:09:16 np0005598095.novalocal sshd-session[7746]: Invalid user pi from 62.106.95.229 port 57602
Jan 27 20:09:16 np0005598095.novalocal sshd-session[7746]: Connection closed by invalid user pi 62.106.95.229 port 57602 [preauth]
Jan 27 20:09:21 np0005598095.novalocal sshd-session[7748]: Connection closed by authenticating user ftp 62.106.95.229 port 57632 [preauth]
Jan 27 20:09:22 np0005598095.novalocal sshd-session[7751]: Accepted publickey for zuul from 38.102.83.114 port 46112 ssh2: RSA SHA256:KbiQ7dOB9mL82DEiBFdAeiKAgIiBJoqnrsw9aytL3+4
Jan 27 20:09:22 np0005598095.novalocal systemd-logind[786]: New session 5 of user zuul.
Jan 27 20:09:22 np0005598095.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 27 20:09:22 np0005598095.novalocal sshd-session[7751]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:09:22 np0005598095.novalocal sudo[7778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rruecaywhzohhkyacntoygpdetusgtru ; /usr/bin/python3'
Jan 27 20:09:22 np0005598095.novalocal sudo[7778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:23 np0005598095.novalocal python3[7780]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-2bde-145a-000000002176-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:09:23 np0005598095.novalocal sudo[7778]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:23 np0005598095.novalocal sudo[7806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sulfmotxchkkqbdnsrpahziujgaqwilz ; /usr/bin/python3'
Jan 27 20:09:23 np0005598095.novalocal sudo[7806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:23 np0005598095.novalocal python3[7808]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:09:23 np0005598095.novalocal sudo[7806]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:23 np0005598095.novalocal sudo[7833]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blvmldhkcpzlywkzyclohqpbwthikzke ; /usr/bin/python3'
Jan 27 20:09:23 np0005598095.novalocal sudo[7833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:23 np0005598095.novalocal python3[7835]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:09:23 np0005598095.novalocal sudo[7833]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:23 np0005598095.novalocal sudo[7859]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pusuyqgzasvezebwwsjebpbwpinskqzm ; /usr/bin/python3'
Jan 27 20:09:23 np0005598095.novalocal sudo[7859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:24 np0005598095.novalocal python3[7861]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:09:24 np0005598095.novalocal sudo[7859]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:24 np0005598095.novalocal sudo[7885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lupbqheqdzifiwshmqydcegygfhngcvr ; /usr/bin/python3'
Jan 27 20:09:24 np0005598095.novalocal sudo[7885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:24 np0005598095.novalocal python3[7887]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:09:24 np0005598095.novalocal sudo[7885]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:24 np0005598095.novalocal sudo[7911]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svbemxyjmdrufpctrcndwcozxzbjcpwa ; /usr/bin/python3'
Jan 27 20:09:24 np0005598095.novalocal sudo[7911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:24 np0005598095.novalocal python3[7913]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:09:24 np0005598095.novalocal sudo[7911]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:25 np0005598095.novalocal sudo[7989]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nftltlsqpoplxtspsrqwkniakpyxmgan ; /usr/bin/python3'
Jan 27 20:09:25 np0005598095.novalocal sudo[7989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:25 np0005598095.novalocal python3[7991]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:09:25 np0005598095.novalocal sudo[7989]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:25 np0005598095.novalocal sudo[8062]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzbprvhlbuuthyhrhcndixosvqacznqh ; /usr/bin/python3'
Jan 27 20:09:25 np0005598095.novalocal sudo[8062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:25 np0005598095.novalocal python3[8064]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769544565.0422616-519-147725826684353/source _original_basename=tmplfwfxz4u follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:09:25 np0005598095.novalocal sudo[8062]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:26 np0005598095.novalocal sudo[8112]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohtbzkfcrxzysqqyzgpxxikqnaybnvqz ; /usr/bin/python3'
Jan 27 20:09:26 np0005598095.novalocal sudo[8112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:26 np0005598095.novalocal python3[8114]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 20:09:26 np0005598095.novalocal systemd[1]: Reloading.
Jan 27 20:09:26 np0005598095.novalocal systemd-rc-local-generator[8135]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:09:26 np0005598095.novalocal sudo[8112]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:28 np0005598095.novalocal sudo[8168]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekcpriiheklujzofjtddtifadioufqdy ; /usr/bin/python3'
Jan 27 20:09:28 np0005598095.novalocal sudo[8168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:28 np0005598095.novalocal python3[8170]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 27 20:09:28 np0005598095.novalocal sudo[8168]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:28 np0005598095.novalocal sudo[8194]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jabsuqbocapbzyxvmvyykqueiuciywyq ; /usr/bin/python3'
Jan 27 20:09:28 np0005598095.novalocal sudo[8194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:28 np0005598095.novalocal python3[8196]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:09:28 np0005598095.novalocal sudo[8194]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:28 np0005598095.novalocal sudo[8222]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnhwdsxntdyzsqlurlkgncbxdcikcjys ; /usr/bin/python3'
Jan 27 20:09:28 np0005598095.novalocal sudo[8222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:28 np0005598095.novalocal python3[8224]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:09:28 np0005598095.novalocal sudo[8222]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:29 np0005598095.novalocal sudo[8250]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpyqcbizskiarentjycmlplaiwcwizrq ; /usr/bin/python3'
Jan 27 20:09:29 np0005598095.novalocal sudo[8250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:29 np0005598095.novalocal python3[8252]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:09:29 np0005598095.novalocal sudo[8250]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:29 np0005598095.novalocal sudo[8278]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-objlhhphtckobpqpdtbfuhbpoypukypr ; /usr/bin/python3'
Jan 27 20:09:29 np0005598095.novalocal sudo[8278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:29 np0005598095.novalocal python3[8280]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:09:29 np0005598095.novalocal sudo[8278]: pam_unix(sudo:session): session closed for user root
Jan 27 20:09:30 np0005598095.novalocal python3[8307]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-2bde-145a-00000000217d-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:09:30 np0005598095.novalocal python3[8337]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 27 20:09:33 np0005598095.novalocal sshd-session[7754]: Connection closed by 38.102.83.114 port 46112
Jan 27 20:09:33 np0005598095.novalocal sshd-session[7751]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:09:33 np0005598095.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 27 20:09:33 np0005598095.novalocal systemd[1]: session-5.scope: Consumed 4.220s CPU time.
Jan 27 20:09:33 np0005598095.novalocal systemd-logind[786]: Session 5 logged out. Waiting for processes to exit.
Jan 27 20:09:33 np0005598095.novalocal systemd-logind[786]: Removed session 5.
Jan 27 20:09:35 np0005598095.novalocal sshd-session[8343]: Accepted publickey for zuul from 38.102.83.114 port 57452 ssh2: RSA SHA256:KbiQ7dOB9mL82DEiBFdAeiKAgIiBJoqnrsw9aytL3+4
Jan 27 20:09:35 np0005598095.novalocal systemd-logind[786]: New session 6 of user zuul.
Jan 27 20:09:35 np0005598095.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 27 20:09:35 np0005598095.novalocal sshd-session[8343]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:09:35 np0005598095.novalocal sudo[8370]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqnutgwwnoyosefhxwtnwinkezlmknnz ; /usr/bin/python3'
Jan 27 20:09:35 np0005598095.novalocal sudo[8370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:09:35 np0005598095.novalocal python3[8372]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 27 20:09:42 np0005598095.novalocal setsebool[8408]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 27 20:09:42 np0005598095.novalocal setsebool[8408]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 27 20:09:57 np0005598095.novalocal kernel: SELinux:  Converting 387 SID table entries...
Jan 27 20:09:57 np0005598095.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 27 20:09:57 np0005598095.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 27 20:09:57 np0005598095.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 27 20:09:57 np0005598095.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 27 20:09:57 np0005598095.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 27 20:09:57 np0005598095.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 27 20:09:57 np0005598095.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 27 20:10:07 np0005598095.novalocal kernel: SELinux:  Converting 390 SID table entries...
Jan 27 20:10:07 np0005598095.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 27 20:10:07 np0005598095.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 27 20:10:07 np0005598095.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 27 20:10:07 np0005598095.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 27 20:10:07 np0005598095.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 27 20:10:07 np0005598095.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 27 20:10:07 np0005598095.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 27 20:10:26 np0005598095.novalocal dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 27 20:10:26 np0005598095.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 27 20:10:26 np0005598095.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 27 20:10:26 np0005598095.novalocal systemd[1]: Reloading.
Jan 27 20:10:26 np0005598095.novalocal systemd-rc-local-generator[9174]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:10:26 np0005598095.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 27 20:10:27 np0005598095.novalocal sudo[8370]: pam_unix(sudo:session): session closed for user root
Jan 27 20:10:34 np0005598095.novalocal python3[14664]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-ab06-807e-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:10:35 np0005598095.novalocal kernel: evm: overlay not supported
Jan 27 20:10:35 np0005598095.novalocal systemd[4312]: Starting D-Bus User Message Bus...
Jan 27 20:10:35 np0005598095.novalocal dbus-broker-launch[15169]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 27 20:10:35 np0005598095.novalocal dbus-broker-launch[15169]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 27 20:10:35 np0005598095.novalocal systemd[4312]: Started D-Bus User Message Bus.
Jan 27 20:10:35 np0005598095.novalocal dbus-broker-lau[15169]: Ready
Jan 27 20:10:35 np0005598095.novalocal systemd[4312]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 27 20:10:35 np0005598095.novalocal systemd[4312]: Created slice Slice /user.
Jan 27 20:10:35 np0005598095.novalocal systemd[4312]: podman-15099.scope: unit configures an IP firewall, but not running as root.
Jan 27 20:10:35 np0005598095.novalocal systemd[4312]: (This warning is only shown for the first unit using IP firewalling.)
Jan 27 20:10:35 np0005598095.novalocal systemd[4312]: Started podman-15099.scope.
Jan 27 20:10:35 np0005598095.novalocal systemd[4312]: Started podman-pause-23aa23a3.scope.
Jan 27 20:10:36 np0005598095.novalocal sudo[15670]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkytihaeacdwrchbmppblrmvqbixreeh ; /usr/bin/python3'
Jan 27 20:10:36 np0005598095.novalocal sudo[15670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:10:36 np0005598095.novalocal python3[15685]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.195:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.195:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:10:36 np0005598095.novalocal python3[15685]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 27 20:10:36 np0005598095.novalocal sudo[15670]: pam_unix(sudo:session): session closed for user root
Jan 27 20:10:37 np0005598095.novalocal sshd-session[8346]: Connection closed by 38.102.83.114 port 57452
Jan 27 20:10:37 np0005598095.novalocal sshd-session[8343]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:10:37 np0005598095.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Jan 27 20:10:37 np0005598095.novalocal systemd[1]: session-6.scope: Consumed 47.152s CPU time.
Jan 27 20:10:37 np0005598095.novalocal systemd-logind[786]: Session 6 logged out. Waiting for processes to exit.
Jan 27 20:10:37 np0005598095.novalocal systemd-logind[786]: Removed session 6.
Jan 27 20:10:55 np0005598095.novalocal sshd-session[23554]: Connection closed by 38.129.56.165 port 59470 [preauth]
Jan 27 20:10:55 np0005598095.novalocal sshd-session[23558]: Connection closed by 38.129.56.165 port 59454 [preauth]
Jan 27 20:10:55 np0005598095.novalocal sshd-session[23561]: Unable to negotiate with 38.129.56.165 port 59480: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 27 20:10:55 np0005598095.novalocal sshd-session[23556]: Unable to negotiate with 38.129.56.165 port 59504: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 27 20:10:56 np0005598095.novalocal sshd-session[23560]: Unable to negotiate with 38.129.56.165 port 59492: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 27 20:11:00 np0005598095.novalocal sshd-session[25408]: Accepted publickey for zuul from 38.102.83.114 port 53476 ssh2: RSA SHA256:KbiQ7dOB9mL82DEiBFdAeiKAgIiBJoqnrsw9aytL3+4
Jan 27 20:11:00 np0005598095.novalocal systemd-logind[786]: New session 7 of user zuul.
Jan 27 20:11:00 np0005598095.novalocal systemd[1]: Started Session 7 of User zuul.
Jan 27 20:11:00 np0005598095.novalocal sshd-session[25408]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:11:00 np0005598095.novalocal python3[25532]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBGPgqfWRtx+db17nGeWyc9hXFjaW/S++pjhZ/AWRWTBvONUTXNRrF8jj5fZTr+prGzSenpZs/9fIRsaWz45NGc= zuul@np0005598093.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 20:11:01 np0005598095.novalocal sudo[25770]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bahtrlkbxwdhrlgeytopubgwzrqxpqql ; /usr/bin/python3'
Jan 27 20:11:01 np0005598095.novalocal sudo[25770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:11:01 np0005598095.novalocal python3[25781]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBGPgqfWRtx+db17nGeWyc9hXFjaW/S++pjhZ/AWRWTBvONUTXNRrF8jj5fZTr+prGzSenpZs/9fIRsaWz45NGc= zuul@np0005598093.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 20:11:01 np0005598095.novalocal sudo[25770]: pam_unix(sudo:session): session closed for user root
Jan 27 20:11:01 np0005598095.novalocal sudo[26247]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqyepboobpcvmqgohxmkvwxhkvthwrun ; /usr/bin/python3'
Jan 27 20:11:01 np0005598095.novalocal sudo[26247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:11:02 np0005598095.novalocal python3[26257]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005598095.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 27 20:11:02 np0005598095.novalocal useradd[26328]: new group: name=cloud-admin, GID=1002
Jan 27 20:11:02 np0005598095.novalocal useradd[26328]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 27 20:11:02 np0005598095.novalocal sudo[26247]: pam_unix(sudo:session): session closed for user root
Jan 27 20:11:05 np0005598095.novalocal sudo[27723]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtdvavpojipngpasygwrnctsqmgtqyiq ; /usr/bin/python3'
Jan 27 20:11:05 np0005598095.novalocal sudo[27723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:11:05 np0005598095.novalocal python3[27733]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBGPgqfWRtx+db17nGeWyc9hXFjaW/S++pjhZ/AWRWTBvONUTXNRrF8jj5fZTr+prGzSenpZs/9fIRsaWz45NGc= zuul@np0005598093.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 27 20:11:05 np0005598095.novalocal sudo[27723]: pam_unix(sudo:session): session closed for user root
Jan 27 20:11:05 np0005598095.novalocal sudo[27995]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlgztzhgnqhfgijslbsnvigyhoqkubgs ; /usr/bin/python3'
Jan 27 20:11:05 np0005598095.novalocal sudo[27995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:11:05 np0005598095.novalocal python3[28002]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:11:05 np0005598095.novalocal sudo[27995]: pam_unix(sudo:session): session closed for user root
Jan 27 20:11:06 np0005598095.novalocal sudo[28237]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbvuykwwhtmntcfbeilshdlsaolgqvje ; /usr/bin/python3'
Jan 27 20:11:06 np0005598095.novalocal sudo[28237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:11:06 np0005598095.novalocal python3[28246]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769544665.6276293-152-9638258419448/source _original_basename=tmptr2oom5x follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:11:06 np0005598095.novalocal sudo[28237]: pam_unix(sudo:session): session closed for user root
Jan 27 20:11:06 np0005598095.novalocal sudo[28576]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivjtlhxibqlsdjcdgkwbwoqlsylecvhc ; /usr/bin/python3'
Jan 27 20:11:06 np0005598095.novalocal sudo[28576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:11:07 np0005598095.novalocal python3[28587]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 27 20:11:07 np0005598095.novalocal systemd[1]: Starting Hostname Service...
Jan 27 20:11:07 np0005598095.novalocal systemd[1]: Started Hostname Service.
Jan 27 20:11:07 np0005598095.novalocal systemd-hostnamed[28706]: Changed pretty hostname to 'compute-1'
Jan 27 20:11:07 compute-1 systemd-hostnamed[28706]: Hostname set to <compute-1> (static)
Jan 27 20:11:07 compute-1 NetworkManager[7199]: <info>  [1769544667.3783] hostname: static hostname changed from "np0005598095.novalocal" to "compute-1"
Jan 27 20:11:07 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 27 20:11:07 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 27 20:11:07 compute-1 sudo[28576]: pam_unix(sudo:session): session closed for user root
Jan 27 20:11:08 compute-1 sshd-session[25465]: Connection closed by 38.102.83.114 port 53476
Jan 27 20:11:08 compute-1 sshd-session[25408]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:11:08 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Jan 27 20:11:08 compute-1 systemd[1]: session-7.scope: Consumed 2.208s CPU time.
Jan 27 20:11:08 compute-1 systemd-logind[786]: Session 7 logged out. Waiting for processes to exit.
Jan 27 20:11:08 compute-1 systemd-logind[786]: Removed session 7.
Jan 27 20:11:11 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 27 20:11:11 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 27 20:11:11 compute-1 systemd[1]: man-db-cache-update.service: Consumed 53.480s CPU time.
Jan 27 20:11:11 compute-1 systemd[1]: run-r7b02cc5f29a34ac7a5ce1e2cf6ddb39d.service: Deactivated successfully.
Jan 27 20:11:17 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 27 20:11:37 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 27 20:12:24 compute-1 sshd-session[30193]: Invalid user ubuntu from 80.94.92.186 port 57230
Jan 27 20:12:24 compute-1 sshd-session[30193]: Connection closed by invalid user ubuntu 80.94.92.186 port 57230 [preauth]
Jan 27 20:12:40 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 27 20:12:40 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 27 20:12:40 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 27 20:12:40 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 27 20:16:04 compute-1 sshd-session[30200]: Accepted publickey for zuul from 38.129.56.165 port 55732 ssh2: RSA SHA256:KbiQ7dOB9mL82DEiBFdAeiKAgIiBJoqnrsw9aytL3+4
Jan 27 20:16:04 compute-1 systemd-logind[786]: New session 8 of user zuul.
Jan 27 20:16:04 compute-1 systemd[1]: Started Session 8 of User zuul.
Jan 27 20:16:04 compute-1 sshd-session[30200]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:16:04 compute-1 python3[30276]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:16:06 compute-1 sudo[30390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsqhhakhycpafrqdpslfajoscqsinqes ; /usr/bin/python3'
Jan 27 20:16:06 compute-1 sudo[30390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:06 compute-1 python3[30392]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:16:06 compute-1 sudo[30390]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:06 compute-1 sudo[30463]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xszufwisaftthasvxednyakcydacuyjq ; /usr/bin/python3'
Jan 27 20:16:06 compute-1 sudo[30463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:07 compute-1 python3[30465]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769544966.283516-33984-122065682693905/source mode=0755 _original_basename=delorean.repo follow=False checksum=2e65f5781089f6db35f20eae2311859479a007a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:16:07 compute-1 sudo[30463]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:07 compute-1 sudo[30489]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fifacwzazbpcaarcgedqijkeklyewjip ; /usr/bin/python3'
Jan 27 20:16:07 compute-1 sudo[30489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:07 compute-1 python3[30491]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:16:07 compute-1 sudo[30489]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:07 compute-1 sudo[30562]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksmfcgybtrfpdzolvhfyhuraicpqurjd ; /usr/bin/python3'
Jan 27 20:16:07 compute-1 sudo[30562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:07 compute-1 python3[30564]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769544966.283516-33984-122065682693905/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=2c5ad31b3cd5c5b96a9995d83e342833f9bd7020 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:16:07 compute-1 sudo[30562]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:07 compute-1 sudo[30588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fifppfsuteszzmdeyxqvchrsrodpwams ; /usr/bin/python3'
Jan 27 20:16:07 compute-1 sudo[30588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:07 compute-1 python3[30590]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:16:08 compute-1 sudo[30588]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:08 compute-1 sudo[30661]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jclpwydkbhwhesladxaujyhoejnyzrhk ; /usr/bin/python3'
Jan 27 20:16:08 compute-1 sudo[30661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:08 compute-1 python3[30663]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769544966.283516-33984-122065682693905/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:16:08 compute-1 sudo[30661]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:08 compute-1 sudo[30687]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvohnconaloxynawdjlbaxnoxwkzjlpo ; /usr/bin/python3'
Jan 27 20:16:08 compute-1 sudo[30687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:08 compute-1 python3[30689]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:16:08 compute-1 sudo[30687]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:08 compute-1 sudo[30760]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vumbusffhdhibhdjcnviixwwwzumrqlk ; /usr/bin/python3'
Jan 27 20:16:08 compute-1 sudo[30760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:08 compute-1 python3[30762]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769544966.283516-33984-122065682693905/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:16:08 compute-1 sudo[30760]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:09 compute-1 sudo[30786]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvzrwcvfiwkjnvwmeonqtjirhekczzpe ; /usr/bin/python3'
Jan 27 20:16:09 compute-1 sudo[30786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:09 compute-1 python3[30788]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:16:09 compute-1 sudo[30786]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:09 compute-1 sudo[30859]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdfgvozglwnbwdhlqhnzwfbscxyrzahn ; /usr/bin/python3'
Jan 27 20:16:09 compute-1 sudo[30859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:09 compute-1 python3[30861]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769544966.283516-33984-122065682693905/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:16:09 compute-1 sudo[30859]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:09 compute-1 sudo[30885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbfinbvdqfnuexjjzevyrlgowfkzygbb ; /usr/bin/python3'
Jan 27 20:16:09 compute-1 sudo[30885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:09 compute-1 python3[30887]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:16:09 compute-1 sudo[30885]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:10 compute-1 sudo[30958]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvwewjvifcxrtcfmwmfufecyystgulrr ; /usr/bin/python3'
Jan 27 20:16:10 compute-1 sudo[30958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:10 compute-1 python3[30960]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769544966.283516-33984-122065682693905/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:16:10 compute-1 sudo[30958]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:10 compute-1 sudo[30984]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjodunmoqtctnuqvivfbxovthvdedjhy ; /usr/bin/python3'
Jan 27 20:16:10 compute-1 sudo[30984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:10 compute-1 python3[30986]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:16:10 compute-1 sudo[30984]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:10 compute-1 sudo[31057]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpxqvrhigdjmxezasgglaxeeqlfcpoxk ; /usr/bin/python3'
Jan 27 20:16:10 compute-1 sudo[31057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:10 compute-1 python3[31059]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769544966.283516-33984-122065682693905/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=aa03f96b62b2a238943efcc5a547883c212e7d56 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:16:10 compute-1 sudo[31057]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:10 compute-1 sudo[31083]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qujzgkltmdscrruwzxxegnkntoxffqoy ; /usr/bin/python3'
Jan 27 20:16:10 compute-1 sudo[31083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:11 compute-1 python3[31085]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 27 20:16:11 compute-1 sudo[31083]: pam_unix(sudo:session): session closed for user root
Jan 27 20:16:11 compute-1 sudo[31156]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpaklxiisfhoprqjpodusjgnhlawgwpk ; /usr/bin/python3'
Jan 27 20:16:11 compute-1 sudo[31156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:16:11 compute-1 python3[31158]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769544966.283516-33984-122065682693905/source mode=0755 _original_basename=gating.repo follow=False checksum=cf52248e1fc0151405dddb44c111ab0b3a3e191e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:16:11 compute-1 sudo[31156]: pam_unix(sudo:session): session closed for user root
Jan 27 20:17:18 compute-1 sshd-session[31184]: Invalid user ubuntu from 80.94.92.186 port 60256
Jan 27 20:17:18 compute-1 sshd-session[31184]: Connection closed by invalid user ubuntu 80.94.92.186 port 60256 [preauth]
Jan 27 20:17:36 compute-1 python3[31209]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:22:11 compute-1 sshd-session[31213]: Invalid user sol from 80.94.92.186 port 35072
Jan 27 20:22:11 compute-1 sshd-session[31213]: Connection closed by invalid user sol 80.94.92.186 port 35072 [preauth]
Jan 27 20:22:36 compute-1 sshd-session[30203]: Received disconnect from 38.129.56.165 port 55732:11: disconnected by user
Jan 27 20:22:36 compute-1 sshd-session[30203]: Disconnected from user zuul 38.129.56.165 port 55732
Jan 27 20:22:36 compute-1 sshd-session[30200]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:22:36 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Jan 27 20:22:36 compute-1 systemd[1]: session-8.scope: Consumed 5.657s CPU time.
Jan 27 20:22:36 compute-1 systemd-logind[786]: Session 8 logged out. Waiting for processes to exit.
Jan 27 20:22:36 compute-1 systemd-logind[786]: Removed session 8.
Jan 27 20:26:53 compute-1 sshd-session[31218]: Invalid user sol from 80.94.92.186 port 38128
Jan 27 20:26:53 compute-1 sshd-session[31218]: Connection closed by invalid user sol 80.94.92.186 port 38128 [preauth]
Jan 27 20:29:01 compute-1 anacron[7288]: Job `cron.weekly' started
Jan 27 20:29:01 compute-1 anacron[7288]: Job `cron.weekly' terminated
Jan 27 20:31:36 compute-1 sshd-session[31224]: Invalid user sol from 80.94.92.186 port 41154
Jan 27 20:31:36 compute-1 sshd-session[31224]: Connection closed by invalid user sol 80.94.92.186 port 41154 [preauth]
Jan 27 20:36:08 compute-1 sshd-session[31228]: Invalid user sol from 80.94.92.186 port 44184
Jan 27 20:36:08 compute-1 sshd-session[31228]: Connection closed by invalid user sol 80.94.92.186 port 44184 [preauth]
Jan 27 20:40:45 compute-1 sshd-session[31233]: Invalid user solana from 80.94.92.186 port 47202
Jan 27 20:40:46 compute-1 sshd-session[31233]: Connection closed by invalid user solana 80.94.92.186 port 47202 [preauth]
Jan 27 20:45:18 compute-1 sshd-session[31236]: Invalid user solana from 80.94.92.186 port 50250
Jan 27 20:45:18 compute-1 sshd-session[31236]: Connection closed by invalid user solana 80.94.92.186 port 50250 [preauth]
Jan 27 20:49:01 compute-1 anacron[7288]: Job `cron.monthly' started
Jan 27 20:49:01 compute-1 anacron[7288]: Job `cron.monthly' terminated
Jan 27 20:49:01 compute-1 anacron[7288]: Normal exit (3 jobs run)
Jan 27 20:49:36 compute-1 sshd-session[31243]: Accepted publickey for zuul from 192.168.122.30 port 54714 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:49:36 compute-1 systemd-logind[786]: New session 9 of user zuul.
Jan 27 20:49:36 compute-1 systemd[1]: Started Session 9 of User zuul.
Jan 27 20:49:36 compute-1 sshd-session[31243]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:49:37 compute-1 python3.9[31396]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:49:38 compute-1 sudo[31575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxaqqgillzcdoxzyriaoatvkqgnatjeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769546978.461177-40-65808319018458/AnsiballZ_command.py'
Jan 27 20:49:38 compute-1 sudo[31575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:49:39 compute-1 python3.9[31577]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:49:47 compute-1 sudo[31575]: pam_unix(sudo:session): session closed for user root
Jan 27 20:49:47 compute-1 sshd-session[31246]: Connection closed by 192.168.122.30 port 54714
Jan 27 20:49:47 compute-1 sshd-session[31243]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:49:47 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Jan 27 20:49:47 compute-1 systemd[1]: session-9.scope: Consumed 8.478s CPU time.
Jan 27 20:49:47 compute-1 systemd-logind[786]: Session 9 logged out. Waiting for processes to exit.
Jan 27 20:49:47 compute-1 systemd-logind[786]: Removed session 9.
Jan 27 20:49:52 compute-1 sshd-session[31634]: Accepted publickey for zuul from 192.168.122.30 port 60306 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:49:52 compute-1 systemd-logind[786]: New session 10 of user zuul.
Jan 27 20:49:52 compute-1 systemd[1]: Started Session 10 of User zuul.
Jan 27 20:49:52 compute-1 sshd-session[31634]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:49:53 compute-1 python3.9[31787]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:49:54 compute-1 sshd-session[31637]: Connection closed by 192.168.122.30 port 60306
Jan 27 20:49:54 compute-1 sshd-session[31634]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:49:54 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Jan 27 20:49:54 compute-1 systemd-logind[786]: Session 10 logged out. Waiting for processes to exit.
Jan 27 20:49:54 compute-1 systemd-logind[786]: Removed session 10.
Jan 27 20:49:57 compute-1 sshd-session[31816]: Invalid user solana from 80.94.92.186 port 53304
Jan 27 20:49:57 compute-1 sshd-session[31816]: Connection closed by invalid user solana 80.94.92.186 port 53304 [preauth]
Jan 27 20:50:10 compute-1 sshd-session[31818]: Accepted publickey for zuul from 192.168.122.30 port 49532 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:50:10 compute-1 systemd-logind[786]: New session 11 of user zuul.
Jan 27 20:50:10 compute-1 systemd[1]: Started Session 11 of User zuul.
Jan 27 20:50:10 compute-1 sshd-session[31818]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:50:11 compute-1 python3.9[31971]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 27 20:50:12 compute-1 python3.9[32145]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:50:13 compute-1 sudo[32295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmltdwrfesicfwtktetqcstvrtgzpxio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547012.8491051-65-95760423937461/AnsiballZ_command.py'
Jan 27 20:50:13 compute-1 sudo[32295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:50:13 compute-1 python3.9[32297]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:50:13 compute-1 sudo[32295]: pam_unix(sudo:session): session closed for user root
Jan 27 20:50:14 compute-1 sudo[32448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eokpchzyxkzgebaxnutuybhhkiiwmuvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547014.058953-89-124108661562717/AnsiballZ_stat.py'
Jan 27 20:50:14 compute-1 sudo[32448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:50:14 compute-1 python3.9[32450]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 20:50:14 compute-1 sudo[32448]: pam_unix(sudo:session): session closed for user root
Jan 27 20:50:15 compute-1 sudo[32600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvxwlpduvblglnukxjvxvubcpqvszkee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547015.030208-105-274549047667489/AnsiballZ_file.py'
Jan 27 20:50:15 compute-1 sudo[32600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:50:15 compute-1 python3.9[32602]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:50:15 compute-1 sudo[32600]: pam_unix(sudo:session): session closed for user root
Jan 27 20:50:16 compute-1 sudo[32752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rheydhlhgjmgywgfgqkirzpaiyjukghx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547015.916359-121-60706064330603/AnsiballZ_stat.py'
Jan 27 20:50:16 compute-1 sudo[32752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:50:16 compute-1 python3.9[32754]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:50:16 compute-1 sudo[32752]: pam_unix(sudo:session): session closed for user root
Jan 27 20:50:17 compute-1 sudo[32875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcfimueaylwmhchlfzeduvxekrwzyzad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547015.916359-121-60706064330603/AnsiballZ_copy.py'
Jan 27 20:50:17 compute-1 sudo[32875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:50:17 compute-1 python3.9[32877]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547015.916359-121-60706064330603/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:50:17 compute-1 sudo[32875]: pam_unix(sudo:session): session closed for user root
Jan 27 20:50:17 compute-1 sudo[33027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyoxqzehrrdktdwpjhxjjiigwbwgtaku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547017.497795-151-271735284421320/AnsiballZ_setup.py'
Jan 27 20:50:17 compute-1 sudo[33027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:50:18 compute-1 python3.9[33029]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:50:18 compute-1 sudo[33027]: pam_unix(sudo:session): session closed for user root
Jan 27 20:50:19 compute-1 sudo[33183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaxneggkiftthjfgfqttcgvugdwvnhfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547018.6031494-167-66341085956072/AnsiballZ_file.py'
Jan 27 20:50:19 compute-1 sudo[33183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:50:19 compute-1 python3.9[33185]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:50:19 compute-1 sudo[33183]: pam_unix(sudo:session): session closed for user root
Jan 27 20:50:19 compute-1 sudo[33335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zicfphnekzzowggdftmbkvgzajoujtny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547019.6345403-185-59145315729692/AnsiballZ_file.py'
Jan 27 20:50:19 compute-1 sudo[33335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:50:20 compute-1 python3.9[33337]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:50:20 compute-1 sudo[33335]: pam_unix(sudo:session): session closed for user root
Jan 27 20:50:21 compute-1 python3.9[33487]: ansible-ansible.builtin.service_facts Invoked
Jan 27 20:50:26 compute-1 python3.9[33740]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:50:27 compute-1 python3.9[33890]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:50:29 compute-1 python3.9[34044]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:50:30 compute-1 sudo[34200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yerqotpfkeecuvdahiwottabooxvihcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547030.1259995-281-281395235711557/AnsiballZ_setup.py'
Jan 27 20:50:30 compute-1 sudo[34200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:50:31 compute-1 python3.9[34202]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 20:50:31 compute-1 sudo[34200]: pam_unix(sudo:session): session closed for user root
Jan 27 20:50:31 compute-1 sudo[34284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agmzmqbbrphsuvbhsudzcjulvfnxntyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547030.1259995-281-281395235711557/AnsiballZ_dnf.py'
Jan 27 20:50:31 compute-1 sudo[34284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:50:31 compute-1 python3.9[34286]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:51:45 compute-1 systemd[1]: Reloading.
Jan 27 20:51:45 compute-1 systemd-rc-local-generator[34708]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:51:46 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 27 20:51:46 compute-1 systemd[1]: Reloading.
Jan 27 20:51:46 compute-1 systemd-rc-local-generator[34757]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:51:46 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 27 20:51:46 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 27 20:51:46 compute-1 systemd[1]: Reloading.
Jan 27 20:51:46 compute-1 systemd-rc-local-generator[34798]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:51:46 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 27 20:51:47 compute-1 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Jan 27 20:51:47 compute-1 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Jan 27 20:52:17 compute-1 sshd-session[34924]: Connection closed by authenticating user root 47.236.13.75 port 38504 [preauth]
Jan 27 20:52:41 compute-1 sshd-session[35016]: Connection closed by 45.148.10.121 port 37052 [preauth]
Jan 27 20:52:48 compute-1 kernel: SELinux:  Converting 2724 SID table entries...
Jan 27 20:52:48 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 27 20:52:48 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 27 20:52:48 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 27 20:52:48 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 27 20:52:48 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 27 20:52:48 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 27 20:52:48 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 27 20:52:48 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 27 20:52:48 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 27 20:52:48 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 27 20:52:48 compute-1 systemd[1]: Reloading.
Jan 27 20:52:49 compute-1 systemd-rc-local-generator[35134]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:52:49 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 27 20:52:49 compute-1 sudo[34284]: pam_unix(sudo:session): session closed for user root
Jan 27 20:52:49 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 27 20:52:49 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 27 20:52:49 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.082s CPU time.
Jan 27 20:52:49 compute-1 systemd[1]: run-r31f747a007804e1b93357be85b57ec6f.service: Deactivated successfully.
Jan 27 20:52:56 compute-1 sudo[36046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfmnmohvozlldziqrthjbpfohqrjfbnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547175.975668-305-27625941154381/AnsiballZ_command.py'
Jan 27 20:52:56 compute-1 sudo[36046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:52:56 compute-1 python3.9[36048]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:52:57 compute-1 sudo[36046]: pam_unix(sudo:session): session closed for user root
Jan 27 20:52:58 compute-1 sudo[36327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfoibvdeqxmlntxyflhdrfufvokdgxvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547177.8877597-322-210944795791522/AnsiballZ_selinux.py'
Jan 27 20:52:58 compute-1 sudo[36327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:52:58 compute-1 python3.9[36329]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 27 20:52:58 compute-1 sudo[36327]: pam_unix(sudo:session): session closed for user root
Jan 27 20:52:59 compute-1 sudo[36479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zriaunyhfwdrvryysyfmvgdewldaldhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547179.1736438-343-69384727383703/AnsiballZ_command.py'
Jan 27 20:52:59 compute-1 sudo[36479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:52:59 compute-1 python3.9[36481]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 27 20:53:00 compute-1 sudo[36479]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:01 compute-1 sudo[36632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chtkdpmjaieqfuhzxogzspibjzdafgyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547180.9027288-359-203876888645818/AnsiballZ_file.py'
Jan 27 20:53:01 compute-1 sudo[36632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:01 compute-1 python3.9[36634]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:53:01 compute-1 sudo[36632]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:02 compute-1 sudo[36784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txcaokbzglhqtgeyihgtnalpiddosiwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547182.1263788-376-187620280230505/AnsiballZ_mount.py'
Jan 27 20:53:02 compute-1 sudo[36784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:02 compute-1 python3.9[36786]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 27 20:53:02 compute-1 sudo[36784]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:03 compute-1 sudo[36936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbqpjuzwaxvgdilzwmwokfjfikgfkuoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547183.556662-431-135835490022283/AnsiballZ_file.py'
Jan 27 20:53:03 compute-1 sudo[36936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:04 compute-1 python3.9[36938]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:53:04 compute-1 sudo[36936]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:04 compute-1 sudo[37088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slucsqdosczercmpkjnehbannxxdanjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547184.3797681-447-11992569811471/AnsiballZ_stat.py'
Jan 27 20:53:04 compute-1 sudo[37088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:04 compute-1 python3.9[37090]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:53:04 compute-1 sudo[37088]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:05 compute-1 sudo[37211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggzpukmbzzfljkzifkyzcaivjjwuxdxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547184.3797681-447-11992569811471/AnsiballZ_copy.py'
Jan 27 20:53:05 compute-1 sudo[37211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:05 compute-1 python3.9[37213]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547184.3797681-447-11992569811471/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3b607b21e610ea4b3099e30a5eee3e7772893bf1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:53:05 compute-1 sudo[37211]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:06 compute-1 sudo[37363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fodwfdztxdysumybkwmfjurbisgxejwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547186.0278072-495-81189191163690/AnsiballZ_stat.py'
Jan 27 20:53:06 compute-1 sudo[37363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:08 compute-1 python3.9[37365]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 20:53:08 compute-1 sudo[37363]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:10 compute-1 sudo[37515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhddrzsppvvzhlpfvqsqnuwuswigtuod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547190.0409334-511-266208971847296/AnsiballZ_command.py'
Jan 27 20:53:10 compute-1 sudo[37515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:10 compute-1 python3.9[37517]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:53:10 compute-1 sudo[37515]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:11 compute-1 sudo[37668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqjwyrqrfhtqnvwcoxnzmnorpyojxzrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547190.8960118-527-227907108195735/AnsiballZ_file.py'
Jan 27 20:53:11 compute-1 sudo[37668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:11 compute-1 python3.9[37670]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:53:11 compute-1 sudo[37668]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:12 compute-1 sudo[37820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vljjadgvfwtcvnwfavhupmgnbasyqmak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547191.9177613-549-173474020636243/AnsiballZ_getent.py'
Jan 27 20:53:12 compute-1 sudo[37820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:12 compute-1 python3.9[37822]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 27 20:53:12 compute-1 sudo[37820]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:12 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 27 20:53:13 compute-1 sudo[37974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drbskuzuuicijvnvntwemcfeplekebhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547192.86447-565-73309092117976/AnsiballZ_group.py'
Jan 27 20:53:13 compute-1 sudo[37974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:13 compute-1 python3.9[37976]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 27 20:53:13 compute-1 groupadd[37977]: group added to /etc/group: name=qemu, GID=107
Jan 27 20:53:13 compute-1 groupadd[37977]: group added to /etc/gshadow: name=qemu
Jan 27 20:53:13 compute-1 groupadd[37977]: new group: name=qemu, GID=107
Jan 27 20:53:13 compute-1 sudo[37974]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:14 compute-1 sudo[38132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joluyocsquuylosntnannrhwvqpawaqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547193.9098127-581-257442799466591/AnsiballZ_user.py'
Jan 27 20:53:14 compute-1 sudo[38132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:14 compute-1 python3.9[38134]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 27 20:53:14 compute-1 useradd[38136]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 27 20:53:14 compute-1 sudo[38132]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:15 compute-1 sudo[38292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhdxrvxvfmjdhtsdnswnqqnxbqhxhqse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547194.873332-597-95048599872928/AnsiballZ_getent.py'
Jan 27 20:53:15 compute-1 sudo[38292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:15 compute-1 python3.9[38294]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 27 20:53:15 compute-1 sudo[38292]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:16 compute-1 sudo[38445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arqvhhgcsmvwbygomeqxbdcbigmnrmue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547195.7261796-613-92627855997011/AnsiballZ_group.py'
Jan 27 20:53:16 compute-1 sudo[38445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:16 compute-1 python3.9[38447]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 27 20:53:16 compute-1 groupadd[38448]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 27 20:53:16 compute-1 groupadd[38448]: group added to /etc/gshadow: name=hugetlbfs
Jan 27 20:53:16 compute-1 groupadd[38448]: new group: name=hugetlbfs, GID=42477
Jan 27 20:53:16 compute-1 sudo[38445]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:16 compute-1 sudo[38603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trhnxxjanvogeeqdqwalbxuozfvahegc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547196.4987328-631-119780310476142/AnsiballZ_file.py'
Jan 27 20:53:16 compute-1 sudo[38603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:16 compute-1 python3.9[38605]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 27 20:53:16 compute-1 sudo[38603]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:17 compute-1 sudo[38755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xectaakappdljswdmnzlgfwjturnrzlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547197.3340452-653-19048521175870/AnsiballZ_dnf.py'
Jan 27 20:53:17 compute-1 sudo[38755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:17 compute-1 python3.9[38757]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:53:19 compute-1 sudo[38755]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:19 compute-1 sudo[38908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klfgldovehavbedntvclwgpmxnytusgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547199.6690905-669-194591205374887/AnsiballZ_file.py'
Jan 27 20:53:19 compute-1 sudo[38908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:20 compute-1 python3.9[38910]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:53:20 compute-1 sudo[38908]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:20 compute-1 sudo[39060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukyhlmbvdehxbmccltjncswbsrpofbuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547200.3320882-685-112639628285345/AnsiballZ_stat.py'
Jan 27 20:53:20 compute-1 sudo[39060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:20 compute-1 python3.9[39062]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:53:20 compute-1 sudo[39060]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:21 compute-1 sudo[39183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msinjmupnhnyzotpyeqkswrzbwgoqmhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547200.3320882-685-112639628285345/AnsiballZ_copy.py'
Jan 27 20:53:21 compute-1 sudo[39183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:21 compute-1 python3.9[39185]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547200.3320882-685-112639628285345/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:53:21 compute-1 sudo[39183]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:22 compute-1 sudo[39335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kilxchwozmwzmfmqsbdlasjpqrplnfyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547201.5986753-715-57104468766201/AnsiballZ_systemd.py'
Jan 27 20:53:22 compute-1 sudo[39335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:22 compute-1 python3.9[39337]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 20:53:22 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 27 20:53:22 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 27 20:53:22 compute-1 kernel: Bridge firewalling registered
Jan 27 20:53:22 compute-1 systemd-modules-load[39341]: Inserted module 'br_netfilter'
Jan 27 20:53:22 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 27 20:53:22 compute-1 sudo[39335]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:23 compute-1 sudo[39495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqswrvvpotsvglctbnmwkyjqtklfdfiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547202.987324-731-281414763117101/AnsiballZ_stat.py'
Jan 27 20:53:23 compute-1 sudo[39495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:23 compute-1 python3.9[39497]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:53:23 compute-1 sudo[39495]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:23 compute-1 sudo[39618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tctjoclugokpgcwephrqxfyecwmngfze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547202.987324-731-281414763117101/AnsiballZ_copy.py'
Jan 27 20:53:23 compute-1 sudo[39618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:24 compute-1 python3.9[39620]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547202.987324-731-281414763117101/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:53:24 compute-1 sudo[39618]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:25 compute-1 sudo[39770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwyjkctwmlddlucwpxbqpjemosixghhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547204.4540753-767-222574171908120/AnsiballZ_dnf.py'
Jan 27 20:53:25 compute-1 sudo[39770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:25 compute-1 python3.9[39772]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:53:34 compute-1 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Jan 27 20:53:34 compute-1 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Jan 27 20:53:35 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 27 20:53:35 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 27 20:53:35 compute-1 systemd[1]: Reloading.
Jan 27 20:53:35 compute-1 systemd-rc-local-generator[39866]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:53:35 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 27 20:53:35 compute-1 sudo[39770]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:36 compute-1 python3.9[41352]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 20:53:37 compute-1 python3.9[42580]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 27 20:53:38 compute-1 python3.9[43351]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 20:53:39 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 27 20:53:39 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 27 20:53:39 compute-1 systemd[1]: man-db-cache-update.service: Consumed 4.830s CPU time.
Jan 27 20:53:39 compute-1 systemd[1]: run-r250a95a9d450451885a74b7bbc7d3ea0.service: Deactivated successfully.
Jan 27 20:53:39 compute-1 sudo[43974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbpjivixetmhufjwkullowzxlxguipkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547218.9431992-845-185338212652513/AnsiballZ_command.py'
Jan 27 20:53:39 compute-1 sudo[43974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:39 compute-1 python3.9[43976]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:53:39 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 27 20:53:39 compute-1 systemd[1]: Starting Authorization Manager...
Jan 27 20:53:39 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 27 20:53:40 compute-1 polkitd[44193]: Started polkitd version 0.117
Jan 27 20:53:40 compute-1 polkitd[44193]: Loading rules from directory /etc/polkit-1/rules.d
Jan 27 20:53:40 compute-1 polkitd[44193]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 27 20:53:40 compute-1 polkitd[44193]: Finished loading, compiling and executing 2 rules
Jan 27 20:53:40 compute-1 systemd[1]: Started Authorization Manager.
Jan 27 20:53:40 compute-1 polkitd[44193]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 27 20:53:40 compute-1 sudo[43974]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:40 compute-1 sudo[44361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-belglatyahidgxwplrnrrixbctleghck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547220.6197355-863-255191112139315/AnsiballZ_systemd.py'
Jan 27 20:53:40 compute-1 sudo[44361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:41 compute-1 python3.9[44363]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 20:53:41 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 27 20:53:41 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Jan 27 20:53:41 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 27 20:53:41 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 27 20:53:41 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 27 20:53:41 compute-1 sudo[44361]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:42 compute-1 python3.9[44525]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 27 20:53:45 compute-1 sudo[44675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxiydwxfozjtxmkebllkmajsrllejifc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547224.93722-977-16683760550963/AnsiballZ_systemd.py'
Jan 27 20:53:45 compute-1 sudo[44675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:45 compute-1 python3.9[44677]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 20:53:45 compute-1 systemd[1]: Reloading.
Jan 27 20:53:45 compute-1 systemd-rc-local-generator[44707]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:53:45 compute-1 sudo[44675]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:46 compute-1 sudo[44864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pixareiyggahyrxfyukzxkrrvcdbgsau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547225.9189174-977-96345615350392/AnsiballZ_systemd.py'
Jan 27 20:53:46 compute-1 sudo[44864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:46 compute-1 python3.9[44866]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 20:53:46 compute-1 systemd[1]: Reloading.
Jan 27 20:53:46 compute-1 systemd-rc-local-generator[44897]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:53:46 compute-1 sudo[44864]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:47 compute-1 sudo[45054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lloaojycvuzpnptmfeovdoojtbgkzzyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547227.4960856-1009-164990619438530/AnsiballZ_command.py'
Jan 27 20:53:47 compute-1 sudo[45054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:47 compute-1 python3.9[45056]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:53:48 compute-1 sudo[45054]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:48 compute-1 sudo[45207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agkkyxqqaqrqzboqbnfvdnlqxzssygoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547228.1930547-1025-159403735263278/AnsiballZ_command.py'
Jan 27 20:53:48 compute-1 sudo[45207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:48 compute-1 python3.9[45209]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:53:48 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 27 20:53:48 compute-1 sudo[45207]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:49 compute-1 sudo[45360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfvuvjonkbdqtwrmimkwvbtryilnxyhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547229.1724193-1041-78305084376549/AnsiballZ_command.py'
Jan 27 20:53:49 compute-1 sudo[45360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:49 compute-1 python3.9[45362]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:53:51 compute-1 sudo[45360]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:51 compute-1 sudo[45522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmmhxknwcbwvmgdiattuyzhuphfhlcze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547231.3225856-1057-126387012757199/AnsiballZ_command.py'
Jan 27 20:53:51 compute-1 sudo[45522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:51 compute-1 python3.9[45524]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:53:51 compute-1 sudo[45522]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:52 compute-1 sudo[45675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdiufnujfcvvrfspqpfrcggbwicvfjfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547232.1142645-1073-115522241846491/AnsiballZ_systemd.py'
Jan 27 20:53:52 compute-1 sudo[45675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:53:52 compute-1 python3.9[45677]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 20:53:52 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 27 20:53:52 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Jan 27 20:53:52 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Jan 27 20:53:52 compute-1 systemd[1]: Starting Apply Kernel Variables...
Jan 27 20:53:52 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 27 20:53:52 compute-1 systemd[1]: Finished Apply Kernel Variables.
Jan 27 20:53:52 compute-1 sudo[45675]: pam_unix(sudo:session): session closed for user root
Jan 27 20:53:53 compute-1 sshd-session[31821]: Connection closed by 192.168.122.30 port 49532
Jan 27 20:53:53 compute-1 sshd-session[31818]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:53:53 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Jan 27 20:53:53 compute-1 systemd[1]: session-11.scope: Consumed 2min 20.373s CPU time.
Jan 27 20:53:53 compute-1 systemd-logind[786]: Session 11 logged out. Waiting for processes to exit.
Jan 27 20:53:53 compute-1 systemd-logind[786]: Removed session 11.
Jan 27 20:53:59 compute-1 sshd-session[45707]: Accepted publickey for zuul from 192.168.122.30 port 55206 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:53:59 compute-1 systemd-logind[786]: New session 12 of user zuul.
Jan 27 20:53:59 compute-1 systemd[1]: Started Session 12 of User zuul.
Jan 27 20:53:59 compute-1 sshd-session[45707]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:54:00 compute-1 python3.9[45860]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:54:01 compute-1 python3.9[46014]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:54:02 compute-1 sudo[46168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvrhceynvmvmritsxzkgaefhecjlhhon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547242.1596065-76-150322152311946/AnsiballZ_command.py'
Jan 27 20:54:02 compute-1 sudo[46168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:02 compute-1 python3.9[46170]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:54:02 compute-1 sudo[46168]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:04 compute-1 python3.9[46321]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:54:04 compute-1 sudo[46475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyxcrzngfzzvczspifgnrzgjfjamnqpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547244.4347749-116-22557780169092/AnsiballZ_setup.py'
Jan 27 20:54:04 compute-1 sudo[46475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:05 compute-1 python3.9[46477]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 20:54:05 compute-1 sudo[46475]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:05 compute-1 sudo[46559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfaebdebvsiudpajyadjjslpejqwtkyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547244.4347749-116-22557780169092/AnsiballZ_dnf.py'
Jan 27 20:54:05 compute-1 sudo[46559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:05 compute-1 python3.9[46561]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:54:07 compute-1 sudo[46559]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:07 compute-1 sudo[46712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apkkzvagbabvtjnbfljxwnjriylornyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547247.3550947-140-190790144840453/AnsiballZ_setup.py'
Jan 27 20:54:07 compute-1 sudo[46712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:07 compute-1 python3.9[46714]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 20:54:08 compute-1 sudo[46712]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:08 compute-1 sudo[46883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpbkutlgazdrflcapqyirykloueruqyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547248.4300237-162-177161215002882/AnsiballZ_file.py'
Jan 27 20:54:08 compute-1 sudo[46883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:09 compute-1 python3.9[46885]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:54:09 compute-1 sudo[46883]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:09 compute-1 sudo[47035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaxqpzclkgcqjwabbqvohxudyojunqsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547249.2829258-178-257129810602731/AnsiballZ_command.py'
Jan 27 20:54:09 compute-1 sudo[47035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:09 compute-1 python3.9[47037]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:54:09 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat2794956493-merged.mount: Deactivated successfully.
Jan 27 20:54:09 compute-1 podman[47038]: 2026-01-27 20:54:09.878861295 +0000 UTC m=+0.058952505 system refresh
Jan 27 20:54:09 compute-1 sudo[47035]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:10 compute-1 sudo[47197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dueqsrfooofnpnayggmfcikhwcrmytch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547250.1062765-194-131622358093804/AnsiballZ_stat.py'
Jan 27 20:54:10 compute-1 sudo[47197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:10 compute-1 python3.9[47199]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:54:10 compute-1 sudo[47197]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:10 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:54:11 compute-1 sudo[47320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvoighwilrdarnfywjkckbfozlxovyci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547250.1062765-194-131622358093804/AnsiballZ_copy.py'
Jan 27 20:54:11 compute-1 sudo[47320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:11 compute-1 python3.9[47322]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547250.1062765-194-131622358093804/.source.json follow=False _original_basename=podman_network_config.j2 checksum=e212bdacdc2edae4975aadac0890e378e1330751 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:54:11 compute-1 sudo[47320]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:11 compute-1 sudo[47472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpifnaaypdsamsnajhdqenmpdgakvjkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547251.673429-224-36904608870941/AnsiballZ_stat.py'
Jan 27 20:54:11 compute-1 sudo[47472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:12 compute-1 python3.9[47474]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:54:12 compute-1 sudo[47472]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:12 compute-1 sudo[47595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgyfwuociodvpfgbwkqutuydtktjtrhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547251.673429-224-36904608870941/AnsiballZ_copy.py'
Jan 27 20:54:12 compute-1 sudo[47595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:12 compute-1 python3.9[47597]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547251.673429-224-36904608870941/.source.conf follow=False _original_basename=registries.conf.j2 checksum=3d72769785e04dd3ae90416f7325c617e0f9262b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:54:12 compute-1 sudo[47595]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:13 compute-1 sudo[47747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twrmpakztnhpmgdmpuvhuwnbrsxcjejb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547253.067237-256-107228994030011/AnsiballZ_ini_file.py'
Jan 27 20:54:13 compute-1 sudo[47747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:13 compute-1 python3.9[47749]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:54:13 compute-1 sudo[47747]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:14 compute-1 sudo[47899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijmozvuruualjsaflyeqljxdxbuvqcrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547253.8681874-256-26894065954813/AnsiballZ_ini_file.py'
Jan 27 20:54:14 compute-1 sudo[47899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:14 compute-1 python3.9[47901]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:54:14 compute-1 sudo[47899]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:14 compute-1 sudo[48051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlzhqtsalykbxiuhvelddxdzticszuxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547254.4958086-256-246952926513484/AnsiballZ_ini_file.py'
Jan 27 20:54:14 compute-1 sudo[48051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:14 compute-1 python3.9[48053]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:54:14 compute-1 sudo[48051]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:15 compute-1 sudo[48203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqblqofwxosadvfntxrlkumjqaypozfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547255.1010172-256-80492261803912/AnsiballZ_ini_file.py'
Jan 27 20:54:15 compute-1 sudo[48203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:15 compute-1 python3.9[48205]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:54:15 compute-1 sudo[48203]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:16 compute-1 python3.9[48355]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:54:17 compute-1 sudo[48507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jipgjskyofyytnpmedhwphlnzmjbtyep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547257.073989-336-121754861737052/AnsiballZ_dnf.py'
Jan 27 20:54:17 compute-1 sudo[48507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:17 compute-1 python3.9[48509]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:54:18 compute-1 sudo[48507]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:19 compute-1 sudo[48660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ednrbbglzxmaotjgxwkiozesfgjqwjtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547259.0401664-352-110247916144863/AnsiballZ_dnf.py'
Jan 27 20:54:19 compute-1 sudo[48660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:19 compute-1 python3.9[48662]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:54:22 compute-1 sudo[48660]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:23 compute-1 sudo[48820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zthrknfjjajbldrecjtepzdwutcjfczd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547262.934654-372-49707253543470/AnsiballZ_dnf.py'
Jan 27 20:54:23 compute-1 sudo[48820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:23 compute-1 python3.9[48822]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:54:24 compute-1 sudo[48820]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:25 compute-1 sudo[48973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abjwfwbamnvzuzruuiksvsshjlmygxon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547265.1898143-390-110113708340727/AnsiballZ_dnf.py'
Jan 27 20:54:25 compute-1 sudo[48973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:25 compute-1 python3.9[48975]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:54:27 compute-1 sudo[48973]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:28 compute-1 sudo[49126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihascwwjkqbeknljzlwvzuntansbmatq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547267.654638-412-178430030868803/AnsiballZ_dnf.py'
Jan 27 20:54:28 compute-1 sudo[49126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:28 compute-1 python3.9[49128]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:54:29 compute-1 sudo[49126]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:30 compute-1 sudo[49282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icwqabkehynusptekhvgrpkgacopuyty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547270.073714-428-175114479806173/AnsiballZ_dnf.py'
Jan 27 20:54:30 compute-1 sudo[49282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:30 compute-1 python3.9[49284]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:54:34 compute-1 sudo[49282]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:34 compute-1 sudo[49451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srutzfgjpswopzkjopvixifrlljecltt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547274.4017065-446-231403659873055/AnsiballZ_dnf.py'
Jan 27 20:54:34 compute-1 sudo[49451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:34 compute-1 python3.9[49453]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:54:36 compute-1 sudo[49451]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:36 compute-1 sudo[49604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abgwoceyxupiezxvuohvrcfjqqrczbvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547276.4734643-464-132013361123638/AnsiballZ_dnf.py'
Jan 27 20:54:36 compute-1 sudo[49604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:37 compute-1 python3.9[49606]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:54:44 compute-1 sshd-session[49633]: Invalid user solana from 80.94.92.186 port 56300
Jan 27 20:54:44 compute-1 sshd-session[49633]: Connection closed by invalid user solana 80.94.92.186 port 56300 [preauth]
Jan 27 20:54:51 compute-1 sudo[49604]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:56 compute-1 sudo[49980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpvtwdurqwbhagqtskmgfnqmqorobnjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547296.392156-482-42212312738442/AnsiballZ_dnf.py'
Jan 27 20:54:56 compute-1 sudo[49980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:56 compute-1 python3.9[49982]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:54:58 compute-1 sudo[49980]: pam_unix(sudo:session): session closed for user root
Jan 27 20:54:58 compute-1 sudo[50136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxashrphxxwvjbsckhpyzgnkfxitaocp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547298.6678572-502-60370154678952/AnsiballZ_dnf.py'
Jan 27 20:54:58 compute-1 sudo[50136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:54:59 compute-1 python3.9[50138]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:55:00 compute-1 sudo[50136]: pam_unix(sudo:session): session closed for user root
Jan 27 20:55:01 compute-1 sudo[50293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msrnmgxrxvlejkqiskviixwvpxwiuxtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547301.394033-524-170098620602722/AnsiballZ_file.py'
Jan 27 20:55:01 compute-1 sudo[50293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:55:01 compute-1 python3.9[50295]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:55:01 compute-1 sudo[50293]: pam_unix(sudo:session): session closed for user root
Jan 27 20:55:02 compute-1 sudo[50468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bunxoncqzslgodyspuscsvstsjtgnxrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547302.143032-540-159278300973324/AnsiballZ_stat.py'
Jan 27 20:55:02 compute-1 sudo[50468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:55:02 compute-1 python3.9[50470]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:55:02 compute-1 sudo[50468]: pam_unix(sudo:session): session closed for user root
Jan 27 20:55:03 compute-1 sudo[50591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aunogpnibyexavplyrqxtzoszbicapsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547302.143032-540-159278300973324/AnsiballZ_copy.py'
Jan 27 20:55:03 compute-1 sudo[50591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:55:03 compute-1 python3.9[50593]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769547302.143032-540-159278300973324/.source.json _original_basename=.kxofdudt follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:55:03 compute-1 sudo[50591]: pam_unix(sudo:session): session closed for user root
Jan 27 20:55:04 compute-1 sudo[50743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzbqqnbfyqdkzqbxbdtxrmsrcqhornjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547304.070464-576-158910031241633/AnsiballZ_podman_image.py'
Jan 27 20:55:04 compute-1 sudo[50743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:55:04 compute-1 python3.9[50745]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 27 20:55:04 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:06 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat3529879569-lower\x2dmapped.mount: Deactivated successfully.
Jan 27 20:55:10 compute-1 podman[50757]: 2026-01-27 20:55:10.534526613 +0000 UTC m=+5.695167281 image pull 8cb1c5bd5110b926616067efa18d7f44906d7840bd53541459116085a1a2a2ac 38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Jan 27 20:55:10 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:10 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:10 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:10 compute-1 sudo[50743]: pam_unix(sudo:session): session closed for user root
Jan 27 20:55:11 compute-1 sudo[51054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wixmmowcjubjpjqqbhhxbveqrhxwlgss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547311.437445-598-245157023453578/AnsiballZ_podman_image.py'
Jan 27 20:55:11 compute-1 sudo[51054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:55:11 compute-1 python3.9[51056]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 27 20:55:11 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:19 compute-1 podman[51068]: 2026-01-27 20:55:19.867656428 +0000 UTC m=+7.911337527 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 20:55:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:20 compute-1 sudo[51054]: pam_unix(sudo:session): session closed for user root
Jan 27 20:55:21 compute-1 sudo[51365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmuacnaipbasdakprgfeapmtbklsdvhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547321.4982502-618-71231511203683/AnsiballZ_podman_image.py'
Jan 27 20:55:21 compute-1 sudo[51365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:55:21 compute-1 python3.9[51367]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 27 20:55:22 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:34 compute-1 podman[51379]: 2026-01-27 20:55:34.35724123 +0000 UTC m=+12.301480779 image pull a5aa004c3a6db392cb04fafa2aacae4b2b1bb5836e3e54d23b692771193184c9 38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Jan 27 20:55:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:34 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:55:34 compute-1 sudo[51365]: pam_unix(sudo:session): session closed for user root
Jan 27 20:55:59 compute-1 sudo[51651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiiwpzxnvaeqgykaqenzcjxwuzjuukdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547359.6798735-640-152519859734205/AnsiballZ_podman_image.py'
Jan 27 20:55:59 compute-1 sudo[51651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:00 compute-1 python3.9[51653]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.195:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 27 20:56:00 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:56:02 compute-1 podman[51666]: 2026-01-27 20:56:02.08713164 +0000 UTC m=+1.835188514 image pull 3dd48a6b6936c2fe494640768563fa6ef778e9a42117757cc1008eb89dc8671f 38.102.83.195:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Jan 27 20:56:02 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:56:02 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:56:02 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:56:02 compute-1 sudo[51651]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:02 compute-1 sudo[51916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsgjlkynfreozhiyobzclmgxmhhqurxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547362.4964542-640-73658128022431/AnsiballZ_podman_image.py'
Jan 27 20:56:02 compute-1 sudo[51916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:02 compute-1 python3.9[51918]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 27 20:56:05 compute-1 podman[51929]: 2026-01-27 20:56:05.769672043 +0000 UTC m=+2.743602143 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 27 20:56:05 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:56:05 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:56:05 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:56:06 compute-1 sudo[51916]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:06 compute-1 sshd-session[45710]: Connection closed by 192.168.122.30 port 55206
Jan 27 20:56:06 compute-1 sshd-session[45707]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:56:06 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Jan 27 20:56:06 compute-1 systemd[1]: session-12.scope: Consumed 1min 52.586s CPU time.
Jan 27 20:56:06 compute-1 systemd-logind[786]: Session 12 logged out. Waiting for processes to exit.
Jan 27 20:56:06 compute-1 systemd-logind[786]: Removed session 12.
Jan 27 20:56:12 compute-1 sshd-session[52073]: Accepted publickey for zuul from 192.168.122.30 port 38052 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:56:12 compute-1 systemd-logind[786]: New session 13 of user zuul.
Jan 27 20:56:12 compute-1 systemd[1]: Started Session 13 of User zuul.
Jan 27 20:56:12 compute-1 sshd-session[52073]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:56:13 compute-1 python3.9[52226]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:56:14 compute-1 sudo[52380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yngjfbtordmcodnyseyooqzhueascdtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547373.835686-48-36074944824944/AnsiballZ_getent.py'
Jan 27 20:56:14 compute-1 sudo[52380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:14 compute-1 python3.9[52382]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 27 20:56:14 compute-1 sudo[52380]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:15 compute-1 sudo[52533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaoqhtzxfigujzjcbquwblnlddfjfhak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547374.7367415-64-119673579911103/AnsiballZ_group.py'
Jan 27 20:56:15 compute-1 sudo[52533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:15 compute-1 python3.9[52535]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 27 20:56:15 compute-1 groupadd[52536]: group added to /etc/group: name=openvswitch, GID=42476
Jan 27 20:56:15 compute-1 groupadd[52536]: group added to /etc/gshadow: name=openvswitch
Jan 27 20:56:15 compute-1 groupadd[52536]: new group: name=openvswitch, GID=42476
Jan 27 20:56:15 compute-1 sudo[52533]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:16 compute-1 sudo[52691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxmfddoqoxyvmwdwwsesiarhocgksufh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547375.6522098-80-124994817158618/AnsiballZ_user.py'
Jan 27 20:56:16 compute-1 sudo[52691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:16 compute-1 python3.9[52693]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 27 20:56:16 compute-1 useradd[52695]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 27 20:56:16 compute-1 useradd[52695]: add 'openvswitch' to group 'hugetlbfs'
Jan 27 20:56:16 compute-1 useradd[52695]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 27 20:56:16 compute-1 sudo[52691]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:17 compute-1 sudo[52851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyojzuhygcrfyswjektmlcqrrdtqglas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547376.972268-100-265360401882848/AnsiballZ_setup.py'
Jan 27 20:56:17 compute-1 sudo[52851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:17 compute-1 python3.9[52853]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 20:56:17 compute-1 sudo[52851]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:18 compute-1 sudo[52935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rimtrirzzbvfeljmlbkjvnlskpapcpcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547376.972268-100-265360401882848/AnsiballZ_dnf.py'
Jan 27 20:56:18 compute-1 sudo[52935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:18 compute-1 python3.9[52937]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 20:56:20 compute-1 sudo[52935]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:20 compute-1 sudo[53097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvfptdxeblgtozdihgxnemcpfjtudntn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547380.4970767-128-47711457269801/AnsiballZ_dnf.py'
Jan 27 20:56:20 compute-1 sudo[53097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:21 compute-1 python3.9[53099]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:56:33 compute-1 kernel: SELinux:  Converting 2738 SID table entries...
Jan 27 20:56:33 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 27 20:56:33 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 27 20:56:33 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 27 20:56:33 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 27 20:56:33 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 27 20:56:33 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 27 20:56:33 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 27 20:56:33 compute-1 groupadd[53122]: group added to /etc/group: name=unbound, GID=994
Jan 27 20:56:33 compute-1 groupadd[53122]: group added to /etc/gshadow: name=unbound
Jan 27 20:56:33 compute-1 groupadd[53122]: new group: name=unbound, GID=994
Jan 27 20:56:33 compute-1 useradd[53129]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 27 20:56:33 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 27 20:56:33 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 27 20:56:34 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 27 20:56:34 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 27 20:56:35 compute-1 systemd[1]: Reloading.
Jan 27 20:56:35 compute-1 systemd-sysv-generator[53631]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 20:56:35 compute-1 systemd-rc-local-generator[53628]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:56:35 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 27 20:56:35 compute-1 sudo[53097]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:35 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 27 20:56:35 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 27 20:56:35 compute-1 systemd[1]: run-r3dcfeb98cbee49c2ae3b4691ac27a391.service: Deactivated successfully.
Jan 27 20:56:39 compute-1 sudo[54195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkoiflnjbqfpqemflmufkwralytitsor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547398.521884-144-262024413356381/AnsiballZ_systemd.py'
Jan 27 20:56:39 compute-1 sudo[54195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:39 compute-1 python3.9[54197]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 27 20:56:39 compute-1 systemd[1]: Reloading.
Jan 27 20:56:39 compute-1 systemd-rc-local-generator[54224]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:56:39 compute-1 systemd-sysv-generator[54227]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 20:56:39 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Jan 27 20:56:39 compute-1 chown[54239]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 27 20:56:39 compute-1 ovs-ctl[54244]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 27 20:56:39 compute-1 ovs-ctl[54244]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 27 20:56:39 compute-1 ovs-ctl[54244]: Starting ovsdb-server [  OK  ]
Jan 27 20:56:39 compute-1 ovs-vsctl[54293]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 27 20:56:40 compute-1 ovs-vsctl[54313]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"af804609-b297-47b2-80af-51c874daa876\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 27 20:56:40 compute-1 ovs-ctl[54244]: Configuring Open vSwitch system IDs [  OK  ]
Jan 27 20:56:40 compute-1 ovs-ctl[54244]: Enabling remote OVSDB managers [  OK  ]
Jan 27 20:56:40 compute-1 ovs-vsctl[54319]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 27 20:56:40 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Jan 27 20:56:40 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 27 20:56:40 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 27 20:56:40 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 27 20:56:40 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Jan 27 20:56:40 compute-1 ovs-ctl[54363]: Inserting openvswitch module [  OK  ]
Jan 27 20:56:40 compute-1 ovs-ctl[54332]: Starting ovs-vswitchd [  OK  ]
Jan 27 20:56:40 compute-1 ovs-vsctl[54380]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 27 20:56:40 compute-1 ovs-ctl[54332]: Enabling remote OVSDB managers [  OK  ]
Jan 27 20:56:40 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 27 20:56:40 compute-1 systemd[1]: Starting Open vSwitch...
Jan 27 20:56:40 compute-1 systemd[1]: Finished Open vSwitch.
Jan 27 20:56:40 compute-1 sudo[54195]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:41 compute-1 python3.9[54532]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:56:42 compute-1 sudo[54682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbsfxcuryntohnonmezzwhvbvdvylgie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547401.858999-180-37525363501804/AnsiballZ_sefcontext.py'
Jan 27 20:56:42 compute-1 sudo[54682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:42 compute-1 python3.9[54684]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 27 20:56:43 compute-1 kernel: SELinux:  Converting 2752 SID table entries...
Jan 27 20:56:43 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 27 20:56:43 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 27 20:56:43 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 27 20:56:43 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 27 20:56:43 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 27 20:56:43 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 27 20:56:43 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 27 20:56:43 compute-1 sudo[54682]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:44 compute-1 python3.9[54839]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:56:45 compute-1 sudo[54995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcvngudylhcswbwkqiwlecuhbcitjwdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547405.3052218-216-81564587070481/AnsiballZ_dnf.py'
Jan 27 20:56:45 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 27 20:56:45 compute-1 sudo[54995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:45 compute-1 python3.9[54997]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:56:47 compute-1 sudo[54995]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:47 compute-1 sudo[55148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uknnzkxfbmqhfatupuvnfcdilccveobi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547407.3693044-232-159407743331778/AnsiballZ_command.py'
Jan 27 20:56:47 compute-1 sudo[55148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:48 compute-1 python3.9[55150]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:56:48 compute-1 sudo[55148]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:49 compute-1 sudo[55435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cykptrulvukarxfbyfnpyifbemzoqxoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547409.1409564-248-175689967100787/AnsiballZ_file.py'
Jan 27 20:56:49 compute-1 sudo[55435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:49 compute-1 python3.9[55437]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 27 20:56:49 compute-1 sudo[55435]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:50 compute-1 python3.9[55587]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 20:56:51 compute-1 sudo[55739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjrmofwajzlapoufrgbjsficbnbmkrlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547410.7824268-280-139479171286683/AnsiballZ_dnf.py'
Jan 27 20:56:51 compute-1 sudo[55739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:51 compute-1 python3.9[55741]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:56:53 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 27 20:56:53 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 27 20:56:53 compute-1 systemd[1]: Reloading.
Jan 27 20:56:53 compute-1 systemd-sysv-generator[55784]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 20:56:53 compute-1 systemd-rc-local-generator[55781]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:56:53 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 27 20:56:53 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 27 20:56:53 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 27 20:56:53 compute-1 systemd[1]: run-r640f6b9bf868434d8727eef83270637c.service: Deactivated successfully.
Jan 27 20:56:53 compute-1 sudo[55739]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:54 compute-1 sudo[56055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mncwxovekdteyagumlkpallyglluivwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547414.042438-296-8060206996907/AnsiballZ_systemd.py'
Jan 27 20:56:54 compute-1 sudo[56055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:54 compute-1 python3.9[56057]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 20:56:54 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 27 20:56:54 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Jan 27 20:56:54 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Jan 27 20:56:54 compute-1 systemd[1]: Stopping Network Manager...
Jan 27 20:56:54 compute-1 NetworkManager[7199]: <info>  [1769547414.8042] caught SIGTERM, shutting down normally.
Jan 27 20:56:54 compute-1 NetworkManager[7199]: <info>  [1769547414.8065] dhcp4 (eth0): canceled DHCP transaction
Jan 27 20:56:54 compute-1 NetworkManager[7199]: <info>  [1769547414.8065] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 27 20:56:54 compute-1 NetworkManager[7199]: <info>  [1769547414.8065] dhcp4 (eth0): state changed no lease
Jan 27 20:56:54 compute-1 NetworkManager[7199]: <info>  [1769547414.8068] manager: NetworkManager state is now CONNECTED_SITE
Jan 27 20:56:54 compute-1 NetworkManager[7199]: <info>  [1769547414.8142] exiting (success)
Jan 27 20:56:54 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 27 20:56:54 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 27 20:56:54 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 27 20:56:54 compute-1 systemd[1]: Stopped Network Manager.
Jan 27 20:56:54 compute-1 systemd[1]: NetworkManager.service: Consumed 28.895s CPU time, 4.1M memory peak, read 0B from disk, written 17.5K to disk.
Jan 27 20:56:54 compute-1 systemd[1]: Starting Network Manager...
Jan 27 20:56:54 compute-1 NetworkManager[56069]: <info>  [1769547414.8895] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ee988f58-1e06-463e-8261-22f688d902e1)
Jan 27 20:56:54 compute-1 NetworkManager[56069]: <info>  [1769547414.8896] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 27 20:56:54 compute-1 NetworkManager[56069]: <info>  [1769547414.8956] manager[0x56017ac74000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 27 20:56:54 compute-1 systemd[1]: Starting Hostname Service...
Jan 27 20:56:55 compute-1 systemd[1]: Started Hostname Service.
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0095] hostname: hostname: using hostnamed
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0096] hostname: static hostname changed from (none) to "compute-1"
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0104] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0111] manager[0x56017ac74000]: rfkill: Wi-Fi hardware radio set enabled
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0111] manager[0x56017ac74000]: rfkill: WWAN hardware radio set enabled
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0137] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0149] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0150] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0151] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0152] manager: Networking is enabled by state file
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0156] settings: Loaded settings plugin: keyfile (internal)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0160] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0196] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0211] dhcp: init: Using DHCP client 'internal'
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0214] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0223] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0230] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0241] device (lo): Activation: starting connection 'lo' (f9304b27-0492-4654-ac3b-87bfd4814846)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0250] device (eth0): carrier: link connected
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0255] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0263] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0264] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0273] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0283] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0292] device (eth1): carrier: link connected
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0297] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0306] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (d9c8ffab-7346-50a5-a026-9fa25258efe7) (indicated)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0306] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0314] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0325] device (eth1): Activation: starting connection 'ci-private-network' (d9c8ffab-7346-50a5-a026-9fa25258efe7)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0334] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 27 20:56:55 compute-1 systemd[1]: Started Network Manager.
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0353] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0355] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0357] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0360] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0364] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0367] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0371] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0376] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0385] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0390] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0403] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0428] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0436] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0438] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0441] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0444] device (lo): Activation: successful, device activated.
Jan 27 20:56:55 compute-1 systemd[1]: Starting Network Manager Wait Online...
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0452] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0506] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0511] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0512] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0514] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0516] device (eth1): Activation: successful, device activated.
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0555] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0556] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0558] manager: NetworkManager state is now CONNECTED_SITE
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0560] device (eth0): Activation: successful, device activated.
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0564] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 27 20:56:55 compute-1 NetworkManager[56069]: <info>  [1769547415.0596] manager: startup complete
Jan 27 20:56:55 compute-1 sudo[56055]: pam_unix(sudo:session): session closed for user root
Jan 27 20:56:55 compute-1 systemd[1]: Finished Network Manager Wait Online.
Jan 27 20:56:55 compute-1 sudo[56281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijmuxgjumxzsfgabkhsurwpoeaypkiwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547415.346471-312-269613430605675/AnsiballZ_dnf.py'
Jan 27 20:56:55 compute-1 sudo[56281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:56:55 compute-1 python3.9[56283]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:57:01 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 27 20:57:01 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 27 20:57:01 compute-1 systemd[1]: Reloading.
Jan 27 20:57:01 compute-1 systemd-sysv-generator[56335]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 20:57:01 compute-1 systemd-rc-local-generator[56331]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:57:01 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 27 20:57:01 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 27 20:57:01 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 27 20:57:01 compute-1 systemd[1]: run-r31c52188411c409196d017552e9b4ac3.service: Deactivated successfully.
Jan 27 20:57:02 compute-1 sudo[56281]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:02 compute-1 sudo[56740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikzselqmesfammujogkvvktvuzrandun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547422.4854686-336-117605352457916/AnsiballZ_stat.py'
Jan 27 20:57:02 compute-1 sudo[56740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:03 compute-1 python3.9[56742]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 20:57:03 compute-1 sudo[56740]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:03 compute-1 sudo[56892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hipjblqkipbfmqkuwwyqoqicxpwzoeso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547423.2424417-354-52497300614975/AnsiballZ_ini_file.py'
Jan 27 20:57:03 compute-1 sudo[56892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:03 compute-1 python3.9[56894]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:03 compute-1 sudo[56892]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:04 compute-1 sudo[57046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cexghgqztexxhtfijcwzdmegexyjzsqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547424.6076133-374-238991384430356/AnsiballZ_ini_file.py'
Jan 27 20:57:04 compute-1 sudo[57046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:05 compute-1 python3.9[57048]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:05 compute-1 sudo[57046]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:05 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 27 20:57:05 compute-1 sudo[57198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jasyjfogbvrmmazhiwbokiwabzjutvxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547425.2657506-374-173235498606895/AnsiballZ_ini_file.py'
Jan 27 20:57:05 compute-1 sudo[57198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:05 compute-1 python3.9[57200]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:05 compute-1 sudo[57198]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:06 compute-1 sudo[57350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrezzdzusnvqbjbfailbnyhxelijvujm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547425.978579-404-8596273962554/AnsiballZ_ini_file.py'
Jan 27 20:57:06 compute-1 sudo[57350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:06 compute-1 python3.9[57352]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:06 compute-1 sudo[57350]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:07 compute-1 sudo[57502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwevrkwrsgefouoeweknnlgyiukfvnap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547426.7102156-404-143517121668698/AnsiballZ_ini_file.py'
Jan 27 20:57:07 compute-1 sudo[57502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:07 compute-1 python3.9[57504]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:07 compute-1 sudo[57502]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:07 compute-1 sudo[57654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxslhkyuoosxajtohusbiciefjvbepmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547427.518204-434-233530849714159/AnsiballZ_stat.py'
Jan 27 20:57:07 compute-1 sudo[57654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:08 compute-1 python3.9[57656]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:57:08 compute-1 sudo[57654]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:08 compute-1 sudo[57777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pooejfemnfgvkuukvmydearaclfpiqdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547427.518204-434-233530849714159/AnsiballZ_copy.py'
Jan 27 20:57:08 compute-1 sudo[57777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:08 compute-1 python3.9[57779]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547427.518204-434-233530849714159/.source _original_basename=.s_l5h4pj follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:08 compute-1 sudo[57777]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:09 compute-1 sudo[57929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gthqhmuhobrzgflsbzduwldrcqfyoccg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547428.9586442-464-208875053948680/AnsiballZ_file.py'
Jan 27 20:57:09 compute-1 sudo[57929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:09 compute-1 python3.9[57931]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:09 compute-1 sudo[57929]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:10 compute-1 sudo[58081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxoudmqgcfcdhwrdjjcbnpbrttzogbtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547429.812309-480-32888558806920/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 27 20:57:10 compute-1 sudo[58081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:10 compute-1 python3.9[58083]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 27 20:57:10 compute-1 sudo[58081]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:11 compute-1 sudo[58233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qodfckvaumtnodlszmyesfleyjjvplrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547430.7485688-498-164708032260583/AnsiballZ_file.py'
Jan 27 20:57:11 compute-1 sudo[58233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:11 compute-1 python3.9[58235]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:11 compute-1 sudo[58233]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:11 compute-1 sudo[58385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aywpnytatelmtzwalgxygcfdnqrynzlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547431.5847225-518-274577930877574/AnsiballZ_stat.py'
Jan 27 20:57:11 compute-1 sudo[58385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:12 compute-1 sudo[58385]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:12 compute-1 sudo[58508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrtojixxjmulejzmklptocejbfrdrmmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547431.5847225-518-274577930877574/AnsiballZ_copy.py'
Jan 27 20:57:12 compute-1 sudo[58508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:12 compute-1 sudo[58508]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:13 compute-1 sudo[58660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggovvaoipvrlwmceidnogrtomgzdfpzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547432.9242482-548-110772354684366/AnsiballZ_slurp.py'
Jan 27 20:57:13 compute-1 sudo[58660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:13 compute-1 python3.9[58662]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 27 20:57:13 compute-1 sudo[58660]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:14 compute-1 sudo[58835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnzwtxpvpyjoeiihgzezltdwmndymegc ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547434.0476704-566-105344347786272/async_wrapper.py j633356186338 300 /home/zuul/.ansible/tmp/ansible-tmp-1769547434.0476704-566-105344347786272/AnsiballZ_edpm_os_net_config.py _'
Jan 27 20:57:14 compute-1 sudo[58835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:15 compute-1 ansible-async_wrapper.py[58837]: Invoked with j633356186338 300 /home/zuul/.ansible/tmp/ansible-tmp-1769547434.0476704-566-105344347786272/AnsiballZ_edpm_os_net_config.py _
Jan 27 20:57:15 compute-1 ansible-async_wrapper.py[58840]: Starting module and watcher
Jan 27 20:57:15 compute-1 ansible-async_wrapper.py[58840]: Start watching 58841 (300)
Jan 27 20:57:15 compute-1 ansible-async_wrapper.py[58841]: Start module (58841)
Jan 27 20:57:15 compute-1 ansible-async_wrapper.py[58837]: Return async_wrapper task started.
Jan 27 20:57:15 compute-1 sudo[58835]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:15 compute-1 python3.9[58842]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 27 20:57:16 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 27 20:57:16 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 27 20:57:16 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 27 20:57:16 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 27 20:57:16 compute-1 kernel: cfg80211: failed to load regulatory.db
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.0965] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.0984] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1503] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1505] audit: op="connection-add" uuid="91360b2c-2d15-47f5-9d15-ca2f60fc2f8b" name="br-ex-br" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1519] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1520] audit: op="connection-add" uuid="6eaf0bf2-354d-4e34-8ca5-dffd164c6bc5" name="br-ex-port" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1531] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1532] audit: op="connection-add" uuid="5a9dcf51-e108-4757-9990-da3a38441b5e" name="eth1-port" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1542] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1544] audit: op="connection-add" uuid="e86a7afe-0edf-46d5-b140-721de070b9f7" name="vlan20-port" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1554] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1555] audit: op="connection-add" uuid="dfe79948-0cf2-4f62-8be3-b61d583c646c" name="vlan21-port" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1566] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1567] audit: op="connection-add" uuid="455860ee-58e4-4a3e-9d8b-974bab9d55a5" name="vlan22-port" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1584] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1599] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1600] audit: op="connection-add" uuid="60b5a013-7a3d-432f-acd6-ef29e11f8a03" name="br-ex-if" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1664] audit: op="connection-update" uuid="d9c8ffab-7346-50a5-a026-9fa25258efe7" name="ci-private-network" args="ovs-external-ids.data,ovs-interface.type,ipv6.method,ipv6.routing-rules,ipv6.dns,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routes,ipv4.method,ipv4.never-default,ipv4.dns,ipv4.routes,ipv4.addresses,ipv4.routing-rules,connection.master,connection.controller,connection.timestamp,connection.slave-type,connection.port-type" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1679] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1681] audit: op="connection-add" uuid="5523cdd8-b156-4b54-80c2-0379f3a69210" name="vlan20-if" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1696] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1697] audit: op="connection-add" uuid="332318b7-e91e-42e4-8392-e772452193d8" name="vlan21-if" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1712] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1713] audit: op="connection-add" uuid="62d0eae3-be55-42b7-a048-55f7733c42fc" name="vlan22-if" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1725] audit: op="connection-delete" uuid="2719f028-bb33-30c6-adfd-93b66b18c548" name="Wired connection 1" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1738] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <warn>  [1769547437.1741] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1746] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1750] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (91360b2c-2d15-47f5-9d15-ca2f60fc2f8b)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1750] audit: op="connection-activate" uuid="91360b2c-2d15-47f5-9d15-ca2f60fc2f8b" name="br-ex-br" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1752] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <warn>  [1769547437.1753] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1757] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1760] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (6eaf0bf2-354d-4e34-8ca5-dffd164c6bc5)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1762] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <warn>  [1769547437.1763] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1766] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1770] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (5a9dcf51-e108-4757-9990-da3a38441b5e)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1772] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <warn>  [1769547437.1772] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1777] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1780] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (e86a7afe-0edf-46d5-b140-721de070b9f7)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1782] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <warn>  [1769547437.1783] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1787] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1790] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (dfe79948-0cf2-4f62-8be3-b61d583c646c)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1792] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <warn>  [1769547437.1793] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1797] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1800] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (455860ee-58e4-4a3e-9d8b-974bab9d55a5)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1801] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1803] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1805] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1810] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <warn>  [1769547437.1811] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1813] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1817] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (60b5a013-7a3d-432f-acd6-ef29e11f8a03)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1817] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1820] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1822] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1823] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1824] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1834] device (eth1): disconnecting for new activation request.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1834] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1836] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1838] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1839] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1841] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <warn>  [1769547437.1842] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1845] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1849] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (5523cdd8-b156-4b54-80c2-0379f3a69210)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1849] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1852] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1853] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1855] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1857] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <warn>  [1769547437.1858] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1861] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1864] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (332318b7-e91e-42e4-8392-e772452193d8)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1865] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1868] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1869] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1871] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1873] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <warn>  [1769547437.1874] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1876] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1880] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (62d0eae3-be55-42b7-a048-55f7733c42fc)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1880] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1883] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1884] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1885] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1887] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1898] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.method,ipv6.addr-gen-mode,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1899] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1902] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1903] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1909] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1912] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1915] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1918] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1920] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 kernel: ovs-system: entered promiscuous mode
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1932] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1937] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1941] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 kernel: Timeout policy base is empty
Jan 27 20:57:17 compute-1 systemd-udevd[58848]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1944] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1952] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1956] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1959] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1963] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1969] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1976] dhcp4 (eth0): canceled DHCP transaction
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1977] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1978] dhcp4 (eth0): state changed no lease
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1981] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 27 20:57:17 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.1998] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2004] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58843 uid=0 result="fail" reason="Device is not activated"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2046] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2070] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2077] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 27 20:57:17 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2122] device (eth1): disconnecting for new activation request.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2123] audit: op="connection-activate" uuid="d9c8ffab-7346-50a5-a026-9fa25258efe7" name="ci-private-network" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2126] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2139] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2255] device (eth1): Activation: starting connection 'ci-private-network' (d9c8ffab-7346-50a5-a026-9fa25258efe7)
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2260] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2274] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2276] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2281] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2284] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2288] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2289] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2291] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2292] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2294] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2295] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58843 uid=0 result="success"
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2297] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2305] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 kernel: br-ex: entered promiscuous mode
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2309] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2313] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2317] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2320] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2324] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2329] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2333] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2337] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2342] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2349] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2353] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2366] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2367] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2372] device (eth1): Activation: successful, device activated.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2425] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2436] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 kernel: vlan22: entered promiscuous mode
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2465] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2466] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2472] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 27 20:57:17 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 27 20:57:17 compute-1 kernel: vlan21: entered promiscuous mode
Jan 27 20:57:17 compute-1 systemd-udevd[58847]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2588] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2598] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2627] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2629] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.2635] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 27 20:57:17 compute-1 kernel: vlan20: entered promiscuous mode
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.3014] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.3017] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.3043] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.3050] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.3061] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.3062] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.3066] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.3073] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.3074] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 27 20:57:17 compute-1 NetworkManager[56069]: <info>  [1769547437.3077] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 27 20:57:18 compute-1 NetworkManager[56069]: <info>  [1769547438.4079] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58843 uid=0 result="success"
Jan 27 20:57:18 compute-1 NetworkManager[56069]: <info>  [1769547438.5571] checkpoint[0x56017ac4a950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 27 20:57:18 compute-1 NetworkManager[56069]: <info>  [1769547438.5574] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58843 uid=0 result="success"
Jan 27 20:57:18 compute-1 NetworkManager[56069]: <info>  [1769547438.8331] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58843 uid=0 result="success"
Jan 27 20:57:18 compute-1 NetworkManager[56069]: <info>  [1769547438.8340] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58843 uid=0 result="success"
Jan 27 20:57:18 compute-1 sudo[59175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odbjuresfgcxvxzvwbkateqzufjmtxef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547438.319883-566-237186475935621/AnsiballZ_async_status.py'
Jan 27 20:57:18 compute-1 sudo[59175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:19 compute-1 NetworkManager[56069]: <info>  [1769547439.0264] audit: op="networking-control" arg="global-dns-configuration" pid=58843 uid=0 result="success"
Jan 27 20:57:19 compute-1 NetworkManager[56069]: <info>  [1769547439.0291] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 27 20:57:19 compute-1 NetworkManager[56069]: <info>  [1769547439.0323] audit: op="networking-control" arg="global-dns-configuration" pid=58843 uid=0 result="success"
Jan 27 20:57:19 compute-1 NetworkManager[56069]: <info>  [1769547439.0342] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58843 uid=0 result="success"
Jan 27 20:57:19 compute-1 python3.9[59177]: ansible-ansible.legacy.async_status Invoked with jid=j633356186338.58837 mode=status _async_dir=/root/.ansible_async
Jan 27 20:57:19 compute-1 sudo[59175]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:19 compute-1 NetworkManager[56069]: <info>  [1769547439.1602] checkpoint[0x56017ac4aa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 27 20:57:19 compute-1 NetworkManager[56069]: <info>  [1769547439.1607] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58843 uid=0 result="success"
Jan 27 20:57:19 compute-1 ansible-async_wrapper.py[58841]: Module complete (58841)
Jan 27 20:57:20 compute-1 ansible-async_wrapper.py[58840]: Done in kid B.
Jan 27 20:57:22 compute-1 sudo[59280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvlhjagcutiszconsdbntztdzvtwxjev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547438.319883-566-237186475935621/AnsiballZ_async_status.py'
Jan 27 20:57:22 compute-1 sudo[59280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:22 compute-1 python3.9[59282]: ansible-ansible.legacy.async_status Invoked with jid=j633356186338.58837 mode=status _async_dir=/root/.ansible_async
Jan 27 20:57:22 compute-1 sudo[59280]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:22 compute-1 sudo[59380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihlxrgtdaxafixdwyjvtrkdvpirxveds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547438.319883-566-237186475935621/AnsiballZ_async_status.py'
Jan 27 20:57:22 compute-1 sudo[59380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:23 compute-1 python3.9[59382]: ansible-ansible.legacy.async_status Invoked with jid=j633356186338.58837 mode=cleanup _async_dir=/root/.ansible_async
Jan 27 20:57:23 compute-1 sudo[59380]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:23 compute-1 sudo[59532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxvjgmwbsfzfnxxvhwjuywkfcfbibyxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547443.4680438-620-268832351075528/AnsiballZ_stat.py'
Jan 27 20:57:23 compute-1 sudo[59532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:24 compute-1 python3.9[59534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:57:24 compute-1 sudo[59532]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:24 compute-1 sudo[59655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omqlfocuelsblywvevbzbhknxqhsoehh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547443.4680438-620-268832351075528/AnsiballZ_copy.py'
Jan 27 20:57:24 compute-1 sudo[59655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:24 compute-1 python3.9[59657]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547443.4680438-620-268832351075528/.source.returncode _original_basename=.qb39gk5g follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:24 compute-1 sudo[59655]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:25 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 27 20:57:25 compute-1 sudo[59809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xagwycgwvnvubqhzwirlditxquvwqwuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547444.8433561-652-175904167917847/AnsiballZ_stat.py'
Jan 27 20:57:25 compute-1 sudo[59809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:25 compute-1 python3.9[59811]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:57:25 compute-1 sudo[59809]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:25 compute-1 sudo[59933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtmajchwqhgmxodyswqllhgacpiqikiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547444.8433561-652-175904167917847/AnsiballZ_copy.py'
Jan 27 20:57:25 compute-1 sudo[59933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:25 compute-1 python3.9[59935]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547444.8433561-652-175904167917847/.source.cfg _original_basename=.vicgk5lb follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:25 compute-1 sudo[59933]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:26 compute-1 sudo[60085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glzvjshsahthqyxfgxcdyayqebwmhhtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547446.0289953-682-29811326524266/AnsiballZ_systemd.py'
Jan 27 20:57:26 compute-1 sudo[60085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:26 compute-1 python3.9[60087]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 20:57:26 compute-1 systemd[1]: Reloading Network Manager...
Jan 27 20:57:26 compute-1 NetworkManager[56069]: <info>  [1769547446.8141] audit: op="reload" arg="0" pid=60091 uid=0 result="success"
Jan 27 20:57:26 compute-1 NetworkManager[56069]: <info>  [1769547446.8149] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 27 20:57:26 compute-1 systemd[1]: Reloaded Network Manager.
Jan 27 20:57:26 compute-1 sudo[60085]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:27 compute-1 sshd-session[52076]: Connection closed by 192.168.122.30 port 38052
Jan 27 20:57:27 compute-1 sshd-session[52073]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:57:27 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Jan 27 20:57:27 compute-1 systemd[1]: session-13.scope: Consumed 51.363s CPU time.
Jan 27 20:57:27 compute-1 systemd-logind[786]: Session 13 logged out. Waiting for processes to exit.
Jan 27 20:57:27 compute-1 systemd-logind[786]: Removed session 13.
Jan 27 20:57:32 compute-1 sshd-session[60124]: Accepted publickey for zuul from 192.168.122.30 port 57852 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:57:32 compute-1 systemd-logind[786]: New session 14 of user zuul.
Jan 27 20:57:32 compute-1 systemd[1]: Started Session 14 of User zuul.
Jan 27 20:57:32 compute-1 sshd-session[60124]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:57:33 compute-1 python3.9[60277]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:57:34 compute-1 python3.9[60431]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 20:57:36 compute-1 python3.9[60621]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:57:36 compute-1 sshd-session[60127]: Connection closed by 192.168.122.30 port 57852
Jan 27 20:57:36 compute-1 sshd-session[60124]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:57:36 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Jan 27 20:57:36 compute-1 systemd[1]: session-14.scope: Consumed 2.318s CPU time.
Jan 27 20:57:36 compute-1 systemd-logind[786]: Session 14 logged out. Waiting for processes to exit.
Jan 27 20:57:36 compute-1 systemd-logind[786]: Removed session 14.
Jan 27 20:57:36 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 27 20:57:41 compute-1 sshd-session[60650]: Accepted publickey for zuul from 192.168.122.30 port 42284 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:57:41 compute-1 systemd-logind[786]: New session 15 of user zuul.
Jan 27 20:57:41 compute-1 systemd[1]: Started Session 15 of User zuul.
Jan 27 20:57:41 compute-1 sshd-session[60650]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:57:42 compute-1 python3.9[60803]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:57:43 compute-1 python3.9[60957]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:57:44 compute-1 sudo[61111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qavcdymxtmwmzdkcjyxvogqabpoqablg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547464.305317-56-15242728131542/AnsiballZ_setup.py'
Jan 27 20:57:44 compute-1 sudo[61111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:44 compute-1 python3.9[61113]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 20:57:45 compute-1 sudo[61111]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:45 compute-1 sudo[61196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdkbkyfporrdvdkebtqslovtlbdtjoxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547464.305317-56-15242728131542/AnsiballZ_dnf.py'
Jan 27 20:57:45 compute-1 sudo[61196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:45 compute-1 python3.9[61198]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:57:46 compute-1 sudo[61196]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:47 compute-1 sudo[61349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emxsbpznbflfwdcpwiykyoyxmqnptlrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547467.285378-80-156455393021797/AnsiballZ_setup.py'
Jan 27 20:57:47 compute-1 sudo[61349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:47 compute-1 python3.9[61351]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 20:57:48 compute-1 sudo[61349]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:49 compute-1 sudo[61541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcqgjzjruerumpsfyupwsjuwrjgmfosd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547468.5439024-102-133173615875366/AnsiballZ_file.py'
Jan 27 20:57:49 compute-1 sudo[61541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:49 compute-1 python3.9[61543]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:49 compute-1 sudo[61541]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:49 compute-1 sudo[61693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enqxeimtftzqmuabbzhzaqfsmnfxuciv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547469.4835386-118-82305662870023/AnsiballZ_command.py'
Jan 27 20:57:49 compute-1 sudo[61693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:50 compute-1 python3.9[61695]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:57:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 20:57:50 compute-1 sudo[61693]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:50 compute-1 sudo[61855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poflhrvfwuwtalpobtmsgyaqytpnfbng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547470.4592001-134-171044142628541/AnsiballZ_stat.py'
Jan 27 20:57:50 compute-1 sudo[61855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:51 compute-1 python3.9[61857]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:57:51 compute-1 sudo[61855]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:51 compute-1 sudo[61933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skvlkbjjfmsfkqpbyooadfnjhijvlwhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547470.4592001-134-171044142628541/AnsiballZ_file.py'
Jan 27 20:57:51 compute-1 sudo[61933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:51 compute-1 python3.9[61935]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:57:51 compute-1 sudo[61933]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:52 compute-1 sudo[62085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnqnuvbvzldokbrbhiijcxynmvjrcxly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547471.914013-158-167158114675875/AnsiballZ_stat.py'
Jan 27 20:57:52 compute-1 sudo[62085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:52 compute-1 python3.9[62087]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:57:52 compute-1 sudo[62085]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:52 compute-1 sudo[62163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyrusjimtlyhhirrzqycnscktsboybar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547471.914013-158-167158114675875/AnsiballZ_file.py'
Jan 27 20:57:52 compute-1 sudo[62163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:52 compute-1 python3.9[62165]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:57:52 compute-1 sudo[62163]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:53 compute-1 sudo[62315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xotqypfpumccnzelnvrimccsgwskfuqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547473.1941175-184-193761055401846/AnsiballZ_ini_file.py'
Jan 27 20:57:53 compute-1 sudo[62315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:53 compute-1 python3.9[62317]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:57:53 compute-1 sudo[62315]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:54 compute-1 sudo[62467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfilbtogumfydtigzcfrqveeyzwudhpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547474.0330548-184-254170691890829/AnsiballZ_ini_file.py'
Jan 27 20:57:54 compute-1 sudo[62467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:54 compute-1 python3.9[62469]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:57:54 compute-1 sudo[62467]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:55 compute-1 sudo[62619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkmfyfizhrjicxybhzwhfvknqjweyyks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547474.7190351-184-260249132704418/AnsiballZ_ini_file.py'
Jan 27 20:57:55 compute-1 sudo[62619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:55 compute-1 python3.9[62621]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:57:55 compute-1 sudo[62619]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:55 compute-1 sudo[62771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kriqvbfihvzmigejgradpnlkskwrwbkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547475.3298552-184-48802379699565/AnsiballZ_ini_file.py'
Jan 27 20:57:55 compute-1 sudo[62771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:55 compute-1 python3.9[62773]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:57:55 compute-1 sudo[62771]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:56 compute-1 sudo[62923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfydvrvzxvwqcizwbfqkidcoijbyyiiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547476.2929518-246-75996431686262/AnsiballZ_dnf.py'
Jan 27 20:57:56 compute-1 sudo[62923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:56 compute-1 python3.9[62925]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:57:58 compute-1 sudo[62923]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:58 compute-1 sudo[63076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybojjhpuzkjqmhuvhqhxkpujekdkrukr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547478.5597255-268-101524713874198/AnsiballZ_setup.py'
Jan 27 20:57:58 compute-1 sudo[63076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:59 compute-1 python3.9[63078]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:57:59 compute-1 sudo[63076]: pam_unix(sudo:session): session closed for user root
Jan 27 20:57:59 compute-1 sudo[63230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrcqztpsrulqfzsvzgszatssprxoeqlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547479.4070706-284-64262736053641/AnsiballZ_stat.py'
Jan 27 20:57:59 compute-1 sudo[63230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:57:59 compute-1 python3.9[63232]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 20:57:59 compute-1 sudo[63230]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:00 compute-1 sudo[63382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oprcjrlzbkncxpzvtxapqqhsakqgfavr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547480.1488898-302-226349791857864/AnsiballZ_stat.py'
Jan 27 20:58:00 compute-1 sudo[63382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:00 compute-1 python3.9[63384]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 20:58:00 compute-1 sudo[63382]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:01 compute-1 sudo[63534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbpvfliuhdduwfrdobywwtssmqzfqlww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547481.002255-322-62983402026426/AnsiballZ_command.py'
Jan 27 20:58:01 compute-1 sudo[63534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:01 compute-1 python3.9[63536]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:58:01 compute-1 sudo[63534]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:02 compute-1 sudo[63687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lplviocstgcsraxkwwiqclkcinoqxaoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547481.8189595-342-26743398069890/AnsiballZ_service_facts.py'
Jan 27 20:58:02 compute-1 sudo[63687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:02 compute-1 python3.9[63689]: ansible-service_facts Invoked
Jan 27 20:58:02 compute-1 network[63706]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 27 20:58:02 compute-1 network[63707]: 'network-scripts' will be removed from distribution in near future.
Jan 27 20:58:02 compute-1 network[63708]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 27 20:58:05 compute-1 sudo[63687]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:07 compute-1 sudo[63991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoewtgmlhucnspzpmrydhceqxufpmtgr ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769547487.505131-372-247487003592369/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769547487.505131-372-247487003592369/args'
Jan 27 20:58:07 compute-1 sudo[63991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:08 compute-1 sudo[63991]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:08 compute-1 sudo[64158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swvngafiwrrwhnlhznaaioslxykgpkni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547488.3232863-394-261285504590282/AnsiballZ_dnf.py'
Jan 27 20:58:08 compute-1 sudo[64158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:08 compute-1 python3.9[64160]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 20:58:10 compute-1 sudo[64158]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:11 compute-1 sudo[64311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yepcazticeykkhhbevyxdncinhpclltm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547490.5074265-420-151105259543499/AnsiballZ_package_facts.py'
Jan 27 20:58:11 compute-1 sudo[64311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:11 compute-1 python3.9[64313]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 27 20:58:11 compute-1 sudo[64311]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:12 compute-1 sudo[64463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgfadjwquqzkbibqhgwessksgbstzvxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547492.184756-440-77302939002013/AnsiballZ_stat.py'
Jan 27 20:58:12 compute-1 sudo[64463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:12 compute-1 python3.9[64465]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:12 compute-1 sudo[64463]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:13 compute-1 sudo[64588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lolkztcmbpaijqutiqbfiukqucusdhfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547492.184756-440-77302939002013/AnsiballZ_copy.py'
Jan 27 20:58:13 compute-1 sudo[64588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:13 compute-1 python3.9[64590]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547492.184756-440-77302939002013/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:13 compute-1 sudo[64588]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:13 compute-1 sudo[64742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcdekdqjgjzsqupcygeflpzucwnwkrug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547493.6022227-471-232675043810796/AnsiballZ_stat.py'
Jan 27 20:58:13 compute-1 sudo[64742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:14 compute-1 python3.9[64744]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:14 compute-1 sudo[64742]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:14 compute-1 sudo[64867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bckcnmgpsmlzsdafbgmwmbmknygdgjcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547493.6022227-471-232675043810796/AnsiballZ_copy.py'
Jan 27 20:58:14 compute-1 sudo[64867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:14 compute-1 python3.9[64869]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547493.6022227-471-232675043810796/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:14 compute-1 sudo[64867]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:16 compute-1 sudo[65021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zabtudxrdvbqhhwfprftoyzqybsjilsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547495.6125386-513-204598697442518/AnsiballZ_lineinfile.py'
Jan 27 20:58:16 compute-1 sudo[65021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:16 compute-1 python3.9[65023]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:16 compute-1 sudo[65021]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:18 compute-1 sudo[65175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klxibqtffynemjqtubfccvadhmjpevus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547498.0931506-543-135454938732093/AnsiballZ_setup.py'
Jan 27 20:58:18 compute-1 sudo[65175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:18 compute-1 python3.9[65177]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 20:58:18 compute-1 sudo[65175]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:19 compute-1 sudo[65259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngubjdlnfdujmqkpyqimijczpxcntqsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547498.0931506-543-135454938732093/AnsiballZ_systemd.py'
Jan 27 20:58:19 compute-1 sudo[65259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:19 compute-1 python3.9[65261]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 20:58:19 compute-1 sudo[65259]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:21 compute-1 sudo[65413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkphqcnbvxjwzqmkygcbttctesksjllz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547501.4981122-575-141237377050637/AnsiballZ_setup.py'
Jan 27 20:58:21 compute-1 sudo[65413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:22 compute-1 python3.9[65415]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 20:58:22 compute-1 sudo[65413]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:22 compute-1 sudo[65497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkqpaalodhuuqzvnvotjmmrnxrtkjunj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547501.4981122-575-141237377050637/AnsiballZ_systemd.py'
Jan 27 20:58:22 compute-1 sudo[65497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:22 compute-1 python3.9[65499]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 20:58:22 compute-1 chronyd[800]: chronyd exiting
Jan 27 20:58:22 compute-1 systemd[1]: Stopping NTP client/server...
Jan 27 20:58:22 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Jan 27 20:58:22 compute-1 systemd[1]: Stopped NTP client/server.
Jan 27 20:58:22 compute-1 systemd[1]: Starting NTP client/server...
Jan 27 20:58:22 compute-1 chronyd[65508]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 27 20:58:22 compute-1 chronyd[65508]: Frequency -24.796 +/- 0.343 ppm read from /var/lib/chrony/drift
Jan 27 20:58:22 compute-1 chronyd[65508]: Loaded seccomp filter (level 2)
Jan 27 20:58:22 compute-1 systemd[1]: Started NTP client/server.
Jan 27 20:58:22 compute-1 sudo[65497]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:23 compute-1 sshd-session[60653]: Connection closed by 192.168.122.30 port 42284
Jan 27 20:58:23 compute-1 sshd-session[60650]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:58:23 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Jan 27 20:58:23 compute-1 systemd[1]: session-15.scope: Consumed 26.693s CPU time.
Jan 27 20:58:23 compute-1 systemd-logind[786]: Session 15 logged out. Waiting for processes to exit.
Jan 27 20:58:23 compute-1 systemd-logind[786]: Removed session 15.
Jan 27 20:58:29 compute-1 sshd-session[65534]: Accepted publickey for zuul from 192.168.122.30 port 47892 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:58:29 compute-1 systemd-logind[786]: New session 16 of user zuul.
Jan 27 20:58:29 compute-1 systemd[1]: Started Session 16 of User zuul.
Jan 27 20:58:29 compute-1 sshd-session[65534]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:58:30 compute-1 python3.9[65687]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:58:31 compute-1 sudo[65841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozphnemkbqnqxstkapjjkeaauioyiqus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547511.0745704-42-29190433996393/AnsiballZ_file.py'
Jan 27 20:58:31 compute-1 sudo[65841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:31 compute-1 python3.9[65843]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:31 compute-1 sudo[65841]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:32 compute-1 sudo[66016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upicjjvbjtdyecuslfdbhrmvdyfzrpme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547512.0147238-58-186787179631262/AnsiballZ_stat.py'
Jan 27 20:58:32 compute-1 sudo[66016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:32 compute-1 python3.9[66018]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:32 compute-1 sudo[66016]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:32 compute-1 sudo[66094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlqvdugyokjylqzxxcrngzrmdcdpnjmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547512.0147238-58-186787179631262/AnsiballZ_file.py'
Jan 27 20:58:32 compute-1 sudo[66094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:33 compute-1 python3.9[66096]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.1txik47_ recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:33 compute-1 sudo[66094]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:34 compute-1 sudo[66246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alnhotpxendinggtndbudovencluyort ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547513.746693-98-76309897261924/AnsiballZ_stat.py'
Jan 27 20:58:34 compute-1 sudo[66246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:34 compute-1 python3.9[66248]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:34 compute-1 sudo[66246]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:34 compute-1 sudo[66369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubkjqzvrzhstlbphphphfrnfynsfrwtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547513.746693-98-76309897261924/AnsiballZ_copy.py'
Jan 27 20:58:34 compute-1 sudo[66369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:35 compute-1 python3.9[66371]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547513.746693-98-76309897261924/.source _original_basename=.qjcff0g8 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:35 compute-1 sudo[66369]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:35 compute-1 sudo[66521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwtjacefwzpqtkihcpjcjqqvyzclwset ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547515.2606852-130-230409750488353/AnsiballZ_file.py'
Jan 27 20:58:35 compute-1 sudo[66521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:35 compute-1 python3.9[66523]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:58:35 compute-1 sudo[66521]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:36 compute-1 sudo[66673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofsxmabqnuprhyruzrhwqzqwulysthhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547515.9788601-146-267925389209709/AnsiballZ_stat.py'
Jan 27 20:58:36 compute-1 sudo[66673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:36 compute-1 python3.9[66675]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:36 compute-1 sudo[66673]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:36 compute-1 sudo[66796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miwbyshdpgbwvbmxuqiswgqmttjhjbdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547515.9788601-146-267925389209709/AnsiballZ_copy.py'
Jan 27 20:58:36 compute-1 sudo[66796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:37 compute-1 python3.9[66798]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547515.9788601-146-267925389209709/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:58:37 compute-1 sudo[66796]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:37 compute-1 sudo[66948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntgahobfmlqwogzhpeqwlbwuyitwcjyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547517.1588898-146-98644310390046/AnsiballZ_stat.py'
Jan 27 20:58:37 compute-1 sudo[66948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:37 compute-1 python3.9[66950]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:37 compute-1 sudo[66948]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:38 compute-1 sudo[67071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxgurjggodpakaisvpxyquswkobrlszz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547517.1588898-146-98644310390046/AnsiballZ_copy.py'
Jan 27 20:58:38 compute-1 sudo[67071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:38 compute-1 python3.9[67073]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547517.1588898-146-98644310390046/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 27 20:58:38 compute-1 sudo[67071]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:38 compute-1 sudo[67224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmbnkthskxylrnrddbfzuxjkemdsyocp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547518.4998353-204-169736140171319/AnsiballZ_file.py'
Jan 27 20:58:38 compute-1 sudo[67224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:39 compute-1 python3.9[67226]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:39 compute-1 sudo[67224]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:39 compute-1 sudo[67376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzcajgxdzghcjrlyheqqwwegmzbvscyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547519.2660303-220-82247757341142/AnsiballZ_stat.py'
Jan 27 20:58:39 compute-1 sudo[67376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:39 compute-1 python3.9[67378]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:39 compute-1 sudo[67376]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:40 compute-1 sudo[67499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzkovyqgylqlftrwvbvzgxaxpbtiavaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547519.2660303-220-82247757341142/AnsiballZ_copy.py'
Jan 27 20:58:40 compute-1 sudo[67499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:40 compute-1 python3.9[67501]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547519.2660303-220-82247757341142/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:40 compute-1 sudo[67499]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:41 compute-1 sudo[67651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnsxuymovcfickfafrpvsszutuualjzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547520.7526255-250-236560274270114/AnsiballZ_stat.py'
Jan 27 20:58:41 compute-1 sudo[67651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:41 compute-1 python3.9[67653]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:41 compute-1 sudo[67651]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:42 compute-1 sudo[67774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryoxbvkrjksfnadlhkwtmlavpkvdelsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547520.7526255-250-236560274270114/AnsiballZ_copy.py'
Jan 27 20:58:42 compute-1 sudo[67774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:42 compute-1 python3.9[67776]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547520.7526255-250-236560274270114/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:42 compute-1 sudo[67774]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:43 compute-1 sudo[67926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwwqxsnqsplsyeuaywftgexmzaaxixkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547522.4494793-280-249118002184824/AnsiballZ_systemd.py'
Jan 27 20:58:43 compute-1 sudo[67926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:43 compute-1 python3.9[67928]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 20:58:43 compute-1 systemd[1]: Reloading.
Jan 27 20:58:43 compute-1 systemd-sysv-generator[67955]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 20:58:43 compute-1 systemd-rc-local-generator[67951]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:58:43 compute-1 systemd[1]: Reloading.
Jan 27 20:58:43 compute-1 systemd-rc-local-generator[67993]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:58:43 compute-1 systemd-sysv-generator[67997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 20:58:43 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Jan 27 20:58:43 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Jan 27 20:58:43 compute-1 sudo[67926]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:44 compute-1 sudo[68156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjzqnwrhjqmpwczadxljcjyeiktrhshi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547524.1583385-296-10432266667105/AnsiballZ_stat.py'
Jan 27 20:58:44 compute-1 sudo[68156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:44 compute-1 python3.9[68158]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:44 compute-1 sudo[68156]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:45 compute-1 sudo[68279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aymaikqltipwbuajtilbgntoriifwlst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547524.1583385-296-10432266667105/AnsiballZ_copy.py'
Jan 27 20:58:45 compute-1 sudo[68279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:45 compute-1 python3.9[68281]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547524.1583385-296-10432266667105/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:45 compute-1 sudo[68279]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:45 compute-1 sudo[68431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbcicybcbpoewmlyrtgttarnrlxauvjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547525.7189717-326-185334142294842/AnsiballZ_stat.py'
Jan 27 20:58:45 compute-1 sudo[68431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:46 compute-1 python3.9[68433]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:46 compute-1 sudo[68431]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:46 compute-1 sudo[68554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dydusiotmodhijgvhzpxflfkyfyooabj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547525.7189717-326-185334142294842/AnsiballZ_copy.py'
Jan 27 20:58:46 compute-1 sudo[68554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:46 compute-1 python3.9[68556]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547525.7189717-326-185334142294842/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:46 compute-1 sudo[68554]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:47 compute-1 sudo[68706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wglpezkhdwxjjjmbzrkcvhhqckqlygsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547527.010162-356-232661505639807/AnsiballZ_systemd.py'
Jan 27 20:58:47 compute-1 sudo[68706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:47 compute-1 python3.9[68708]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 20:58:47 compute-1 systemd[1]: Reloading.
Jan 27 20:58:47 compute-1 systemd-rc-local-generator[68731]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:58:47 compute-1 systemd-sysv-generator[68736]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 20:58:47 compute-1 systemd[1]: Reloading.
Jan 27 20:58:47 compute-1 systemd-sysv-generator[68777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 20:58:47 compute-1 systemd-rc-local-generator[68773]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:58:48 compute-1 systemd[1]: Starting Create netns directory...
Jan 27 20:58:48 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 27 20:58:48 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 27 20:58:48 compute-1 systemd[1]: Finished Create netns directory.
Jan 27 20:58:48 compute-1 sudo[68706]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:49 compute-1 python3.9[68934]: ansible-ansible.builtin.service_facts Invoked
Jan 27 20:58:49 compute-1 network[68951]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 27 20:58:49 compute-1 network[68952]: 'network-scripts' will be removed from distribution in near future.
Jan 27 20:58:49 compute-1 network[68953]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 27 20:58:53 compute-1 sudo[69213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxyajlpallwbznjsjtdeiijwxqjwnfxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547533.657474-388-182903096557508/AnsiballZ_systemd.py'
Jan 27 20:58:53 compute-1 sudo[69213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:54 compute-1 python3.9[69215]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 20:58:54 compute-1 systemd[1]: Reloading.
Jan 27 20:58:54 compute-1 systemd-rc-local-generator[69245]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:58:54 compute-1 systemd-sysv-generator[69248]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 20:58:54 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 27 20:58:54 compute-1 iptables.init[69255]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 27 20:58:54 compute-1 iptables.init[69255]: iptables: Flushing firewall rules: [  OK  ]
Jan 27 20:58:54 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Jan 27 20:58:54 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 27 20:58:54 compute-1 sudo[69213]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:55 compute-1 sudo[69449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqcdsmtgdzhwtzfrndjdbbbpqvbbbxyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547535.1168342-388-111507246326796/AnsiballZ_systemd.py'
Jan 27 20:58:55 compute-1 sudo[69449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:55 compute-1 python3.9[69451]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 20:58:55 compute-1 sudo[69449]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:56 compute-1 sudo[69603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbwachenccbnklryzqflxkwbkndmxmyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547536.120055-420-69848056371826/AnsiballZ_systemd.py'
Jan 27 20:58:56 compute-1 sudo[69603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:56 compute-1 python3.9[69605]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 20:58:56 compute-1 systemd[1]: Reloading.
Jan 27 20:58:56 compute-1 systemd-sysv-generator[69637]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 20:58:56 compute-1 systemd-rc-local-generator[69633]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 20:58:56 compute-1 systemd[1]: Starting Netfilter Tables...
Jan 27 20:58:57 compute-1 systemd[1]: Finished Netfilter Tables.
Jan 27 20:58:57 compute-1 sudo[69603]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:57 compute-1 sudo[69795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iolxpsugkifldmmxrzdqksekqpymcxjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547537.4203105-436-47487581586767/AnsiballZ_command.py'
Jan 27 20:58:57 compute-1 sudo[69795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:58 compute-1 python3.9[69797]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:58:58 compute-1 sudo[69795]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:59 compute-1 sudo[69948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uewnwqxgdthevemglxyfydytelgfgbuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547538.6329985-464-133956446930282/AnsiballZ_stat.py'
Jan 27 20:58:59 compute-1 sudo[69948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:59 compute-1 python3.9[69950]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:58:59 compute-1 sudo[69948]: pam_unix(sudo:session): session closed for user root
Jan 27 20:58:59 compute-1 sudo[70073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txexbnwnofojwvdgixvmvyygnajxurxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547538.6329985-464-133956446930282/AnsiballZ_copy.py'
Jan 27 20:58:59 compute-1 sudo[70073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:58:59 compute-1 python3.9[70075]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547538.6329985-464-133956446930282/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:58:59 compute-1 sudo[70073]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:00 compute-1 sudo[70226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdkzcfjmcwunkoiemzbitpneqscsyiyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547540.263292-494-158895114825040/AnsiballZ_systemd.py'
Jan 27 20:59:00 compute-1 sudo[70226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:01 compute-1 python3.9[70228]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 20:59:01 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Jan 27 20:59:01 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Jan 27 20:59:01 compute-1 sshd[1007]: Received SIGHUP; restarting.
Jan 27 20:59:01 compute-1 sshd[1007]: Server listening on 0.0.0.0 port 22.
Jan 27 20:59:01 compute-1 sshd[1007]: Server listening on :: port 22.
Jan 27 20:59:01 compute-1 sudo[70226]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:02 compute-1 sudo[70382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naohbcjyrlzmufieuvkixsuuiisxiapi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547541.7701597-510-50837896601741/AnsiballZ_file.py'
Jan 27 20:59:02 compute-1 sudo[70382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:02 compute-1 python3.9[70384]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:02 compute-1 sudo[70382]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:02 compute-1 sudo[70534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzbframtetvueoyjguwimvujgzirsoex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547542.5104759-526-237674455690229/AnsiballZ_stat.py'
Jan 27 20:59:02 compute-1 sudo[70534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:03 compute-1 python3.9[70536]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:59:03 compute-1 sudo[70534]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:03 compute-1 sudo[70657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-httivvyjqgtsxikmqvhthldilxvrttyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547542.5104759-526-237674455690229/AnsiballZ_copy.py'
Jan 27 20:59:03 compute-1 sudo[70657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:03 compute-1 python3.9[70659]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547542.5104759-526-237674455690229/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:03 compute-1 sudo[70657]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:04 compute-1 sudo[70809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szuvavmxpnlbqrpmtexwuuubpfqdvydb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547544.0512588-562-138686268051149/AnsiballZ_timezone.py'
Jan 27 20:59:04 compute-1 sudo[70809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:04 compute-1 python3.9[70811]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 27 20:59:04 compute-1 systemd[1]: Starting Time & Date Service...
Jan 27 20:59:04 compute-1 systemd[1]: Started Time & Date Service.
Jan 27 20:59:04 compute-1 sudo[70809]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:06 compute-1 sudo[70965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtmqfcodhjpbypqvagsfusiasamltiiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547545.7894917-580-5521830276245/AnsiballZ_file.py'
Jan 27 20:59:06 compute-1 sudo[70965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:06 compute-1 python3.9[70967]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:06 compute-1 sudo[70965]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:06 compute-1 sudo[71117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jceykkokxmiihmitwrrxyrxotjfzvnlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547546.6423514-596-73840021057012/AnsiballZ_stat.py'
Jan 27 20:59:06 compute-1 sudo[71117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:07 compute-1 python3.9[71119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:59:07 compute-1 sudo[71117]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:07 compute-1 sudo[71240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhljfhbboqoxwbhypduzdfvawkznklqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547546.6423514-596-73840021057012/AnsiballZ_copy.py'
Jan 27 20:59:07 compute-1 sudo[71240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:07 compute-1 python3.9[71242]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547546.6423514-596-73840021057012/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:07 compute-1 sudo[71240]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:08 compute-1 sudo[71392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gccejtcvddbyafrhcoiypwllaeyjudji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547548.2115295-627-236306671748710/AnsiballZ_stat.py'
Jan 27 20:59:08 compute-1 sudo[71392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:08 compute-1 python3.9[71394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:59:08 compute-1 sudo[71392]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:09 compute-1 sudo[71515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwqoezwmuqwxxckqkiphevuzkdaitujm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547548.2115295-627-236306671748710/AnsiballZ_copy.py'
Jan 27 20:59:09 compute-1 sudo[71515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:09 compute-1 python3.9[71517]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547548.2115295-627-236306671748710/.source.yaml _original_basename=.a026lj5s follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:09 compute-1 sudo[71515]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:10 compute-1 sudo[71667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mojyvdmickoedwfftqxoaffiwyoqifrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547549.7737172-656-210527713384771/AnsiballZ_stat.py'
Jan 27 20:59:10 compute-1 sudo[71667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:10 compute-1 python3.9[71669]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:59:10 compute-1 sudo[71667]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:10 compute-1 sudo[71790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nickxoihliybrcgastqkkpmjfjadthhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547549.7737172-656-210527713384771/AnsiballZ_copy.py'
Jan 27 20:59:10 compute-1 sudo[71790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:11 compute-1 python3.9[71792]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547549.7737172-656-210527713384771/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:11 compute-1 sudo[71790]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:11 compute-1 sudo[71942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cupjpmbqzvabmgaaggzefcyuaszccteg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547551.2463067-686-51939731230465/AnsiballZ_command.py'
Jan 27 20:59:11 compute-1 sudo[71942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:11 compute-1 python3.9[71944]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:59:11 compute-1 sudo[71942]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:12 compute-1 sudo[72095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqqsgahxsqldsxxqfyljawhowegsqclw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547551.9495652-702-36719479256666/AnsiballZ_command.py'
Jan 27 20:59:12 compute-1 sudo[72095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:12 compute-1 python3.9[72097]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:59:12 compute-1 sudo[72095]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:13 compute-1 sudo[72248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgkqoblkkquyrzxqghrshzelezzgkqqq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769547552.8362818-719-250504323518185/AnsiballZ_edpm_nftables_from_files.py'
Jan 27 20:59:13 compute-1 sudo[72248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:13 compute-1 python3[72250]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 27 20:59:13 compute-1 sudo[72248]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:14 compute-1 sudo[72400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgqutgeselmtnxtepwnzodwgqwwfytds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547553.847127-735-197668818303914/AnsiballZ_stat.py'
Jan 27 20:59:14 compute-1 sudo[72400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:14 compute-1 python3.9[72402]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:59:14 compute-1 sudo[72400]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:14 compute-1 sudo[72523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnuqhhwbdltkqwmfbhgabihotwbywwyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547553.847127-735-197668818303914/AnsiballZ_copy.py'
Jan 27 20:59:14 compute-1 sudo[72523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:15 compute-1 python3.9[72525]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547553.847127-735-197668818303914/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:15 compute-1 sudo[72523]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:15 compute-1 sudo[72675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suflibabybxgoebyhtodgzlsnsymmuue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547555.4358828-764-69607057675546/AnsiballZ_stat.py'
Jan 27 20:59:15 compute-1 sudo[72675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:16 compute-1 python3.9[72677]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:59:16 compute-1 sudo[72675]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:16 compute-1 sudo[72798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aucjaspdjroxuswcpeyzadxfvonnowqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547555.4358828-764-69607057675546/AnsiballZ_copy.py'
Jan 27 20:59:16 compute-1 sudo[72798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:16 compute-1 python3.9[72800]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547555.4358828-764-69607057675546/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:16 compute-1 sudo[72798]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:17 compute-1 sudo[72950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvkqqysndbwegjpeafotloutttpkegkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547556.8919787-794-10159087196303/AnsiballZ_stat.py'
Jan 27 20:59:17 compute-1 sudo[72950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:17 compute-1 python3.9[72952]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:59:17 compute-1 sudo[72950]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:17 compute-1 sudo[73073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oglrgzgcmvqeuttkamdebviqlmpatwxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547556.8919787-794-10159087196303/AnsiballZ_copy.py'
Jan 27 20:59:17 compute-1 sudo[73073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:17 compute-1 python3.9[73075]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547556.8919787-794-10159087196303/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:17 compute-1 sudo[73073]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:18 compute-1 sudo[73225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bypbdorbubppoivclplpzplamnauqrlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547558.2419136-824-157806269376062/AnsiballZ_stat.py'
Jan 27 20:59:18 compute-1 sudo[73225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:18 compute-1 python3.9[73227]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:59:18 compute-1 sudo[73225]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:19 compute-1 sudo[73348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjcespjasogcdogmwpbmtnnyutakjoha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547558.2419136-824-157806269376062/AnsiballZ_copy.py'
Jan 27 20:59:19 compute-1 sudo[73348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:19 compute-1 python3.9[73350]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547558.2419136-824-157806269376062/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:19 compute-1 sudo[73348]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:20 compute-1 sudo[73500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylupwjwhdfstinlrjzxlbkkjqwjkrzbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547559.7431412-854-276783436684684/AnsiballZ_stat.py'
Jan 27 20:59:20 compute-1 sudo[73500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:20 compute-1 python3.9[73502]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 20:59:20 compute-1 sudo[73500]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:21 compute-1 sudo[73623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-broxrlqppavgeeueljfoekgfnkquffxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547559.7431412-854-276783436684684/AnsiballZ_copy.py'
Jan 27 20:59:21 compute-1 sudo[73623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:21 compute-1 python3.9[73625]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547559.7431412-854-276783436684684/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:21 compute-1 sudo[73623]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:21 compute-1 sudo[73775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baxwwpxitlbxewtwomikbubbigqgmoht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547561.6788418-884-41841630428159/AnsiballZ_file.py'
Jan 27 20:59:21 compute-1 sudo[73775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:22 compute-1 python3.9[73777]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:22 compute-1 sudo[73775]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:22 compute-1 sudo[73927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppjlfrcmpbnlangbeafggaasydjdtzre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547562.431087-900-159943867944840/AnsiballZ_command.py'
Jan 27 20:59:22 compute-1 sudo[73927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:23 compute-1 python3.9[73929]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:59:23 compute-1 sudo[73927]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:23 compute-1 sudo[74086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwblcyldxgaxanflugawzxevlazebuun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547563.3183963-916-85693005835742/AnsiballZ_blockinfile.py'
Jan 27 20:59:23 compute-1 sudo[74086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:23 compute-1 python3.9[74088]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:23 compute-1 sudo[74086]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:24 compute-1 sudo[74239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwwoybgmnkhdsfjlslpzjrljnoqpbzuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547564.5218878-934-274690616146430/AnsiballZ_file.py'
Jan 27 20:59:24 compute-1 sudo[74239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:25 compute-1 python3.9[74241]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:25 compute-1 sudo[74239]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:25 compute-1 sudo[74391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hygbvufiudtoglwucadfiqalerbbgipw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547565.1703002-934-266634841567127/AnsiballZ_file.py'
Jan 27 20:59:25 compute-1 sudo[74391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:25 compute-1 python3.9[74393]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:25 compute-1 sudo[74391]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:26 compute-1 sudo[74543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyihkkkrdmlrarkrwmwjgxbqnugrgioq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547565.8940868-964-183228278926486/AnsiballZ_mount.py'
Jan 27 20:59:26 compute-1 sudo[74543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:26 compute-1 python3.9[74545]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 27 20:59:26 compute-1 sudo[74543]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:26 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 27 20:59:26 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 27 20:59:27 compute-1 sudo[74697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esreybdefxergonkhmchnmlkcazygilx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547566.8491225-964-10258220371310/AnsiballZ_mount.py'
Jan 27 20:59:27 compute-1 sudo[74697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:27 compute-1 python3.9[74699]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 27 20:59:27 compute-1 sudo[74697]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:27 compute-1 sshd-session[65537]: Connection closed by 192.168.122.30 port 47892
Jan 27 20:59:27 compute-1 sshd-session[65534]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:59:27 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Jan 27 20:59:27 compute-1 systemd[1]: session-16.scope: Consumed 37.957s CPU time.
Jan 27 20:59:27 compute-1 systemd-logind[786]: Session 16 logged out. Waiting for processes to exit.
Jan 27 20:59:27 compute-1 systemd-logind[786]: Removed session 16.
Jan 27 20:59:34 compute-1 sshd-session[74725]: Accepted publickey for zuul from 192.168.122.30 port 59258 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:59:34 compute-1 systemd-logind[786]: New session 17 of user zuul.
Jan 27 20:59:34 compute-1 systemd[1]: Started Session 17 of User zuul.
Jan 27 20:59:34 compute-1 sshd-session[74725]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:59:34 compute-1 sudo[74878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmphpwnwpcjvpumqwwxkatlvpdibdhwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547574.2433128-18-23705930884555/AnsiballZ_tempfile.py'
Jan 27 20:59:34 compute-1 sudo[74878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:34 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 27 20:59:34 compute-1 python3.9[74880]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 27 20:59:35 compute-1 sudo[74878]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:35 compute-1 sudo[75032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqdgewmkikcsfhaywlgcwawzqvybkcaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547575.1666746-42-19221251582232/AnsiballZ_stat.py'
Jan 27 20:59:35 compute-1 sudo[75032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:35 compute-1 python3.9[75034]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 20:59:35 compute-1 sudo[75032]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:36 compute-1 sudo[75186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jevwifaozvsqsluhunodethofacdvqyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547576.1054826-62-59414940999515/AnsiballZ_setup.py'
Jan 27 20:59:36 compute-1 sudo[75186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:37 compute-1 python3.9[75188]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:59:37 compute-1 sudo[75186]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:37 compute-1 sshd-session[75059]: Invalid user solana from 80.94.92.186 port 59304
Jan 27 20:59:37 compute-1 sshd-session[75059]: Connection closed by invalid user solana 80.94.92.186 port 59304 [preauth]
Jan 27 20:59:38 compute-1 sudo[75338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nopxzqkrdnkqtkcajysdsphsllnpomxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547577.4814665-79-36137736831741/AnsiballZ_blockinfile.py'
Jan 27 20:59:38 compute-1 sudo[75338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:38 compute-1 python3.9[75340]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+ivP5iEXYmuTBBlgd0eBtVgNFZWpQhBVmPrGLkqk7eGRnkJrAOUZ2mEp5oPXgL3wa5gC3+td9hoXRm+preJuAN/xBg8Mgt8gRhQ9TcSen9qGtOofAcaafrNBhpKLSfr0owA84Rd04wiVYzVZLglbEG5jBmC6IUdch8cPlqOnMF6FzAw7PrDFAhRjzsoOZGgmolmOoB4J6VjVN+R3+lvAlyS7i3Kc9vdM4QZsebz++yTbTi3K1ppTuMZIIfqBuTxmkC4qEkCD9LEsG+Bz/c1KLV17CqVwi88oOX1PMVhWV4CBYL+s4Ll7cooeeJDfgbmHg/k5wFyk6/IogM1pkqOG9PwvSvUPGU9B8SV0+L1sdLbGwILTkJwC/T+L+1NKu9MyVHo4Urci98/1lL7HQ92eZx9TkAxXCa4dJs5XIthvyf6mw854svhHGXsKgPkL0jZUMLfUSFSMWbPgWeukGIA1lKuTcszN9IN1lboug7j8+Wvvlfvh6/lmioO4Z7Rd52zU=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPZjP6IiFVbuVmcLWqpCnwCnIZP829JYVys+g9GXRQdh
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOdweAT5DUHnmuGXrcinMe+HaaLQ5/k/n/qAChPYRMg4RF2+ilkjoNJ9QZCog+mUQL5/sfq/M4ZjwnoojzJ+7F0=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMjWqEKU7RA4wBozcgK8kgglUpzAnQal976R32DGInjfHKGLngRGsbUUoqSXUZbUuoJKmj1g6RKdGFjEsW3RXYMMJwFw/LnRBFkxoSahJgnSjatUxtc1VZbcqrXh7Q4DRilz3FTfSnpHV9B2M17fgKfR4oP0DpzXgZUfVNQykFwTE5uz7M/j9arFHWxhBP/rU+bG07fYBM0qSkMkBLIyZ+0wVLLXyNcRy89ZUaLyufVJCOfrip11wBuyicChPNI6RKY3ZCApIHl2ood+V2ISktCNzMqZeuh1hj2LcWxjtYVJl0zvobOKCYCyE8l296FqVlwdVPwWuPT0y4pIUeSZhaxAm+10Wj48cyKJEoIIxeSIFaoHErZZt+oJA/Q/8thwEnJvLkymarYN9J/Yn8IeaWl9hu5NdKNqB19M81lcgvGieve6m+Vk4qM4l0JKoKbMvA4lf4XKlXOYxyJVUtBjkqeUvwUHq9ep9WW4ceyNMzmDEXpiXRF6Ucv0+Kf0VE4yc=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIELiHVEPmhpBmd1KBrvas2F4wjMYjT2Qk+HcrwGvvzUC
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOcEyN+xpMG3Tcwe8jCsmfMOweFs8KkOvo6QXnnjyODmWxjIfgKBKREfMZ1nlVUiq0aImaCJCovEdwVBCJy43/I=
                                             create=True mode=0644 path=/tmp/ansible.lc4ix9l7 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:38 compute-1 sudo[75338]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:39 compute-1 sudo[75490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abeoohkhtzcusgeiaurmhsqylfuyiyme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547578.5830042-95-25896774625792/AnsiballZ_command.py'
Jan 27 20:59:39 compute-1 sudo[75490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:39 compute-1 python3.9[75492]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.lc4ix9l7' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:59:39 compute-1 sudo[75490]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:39 compute-1 sudo[75644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pptqassrephwznlfmlpokkstyltbekoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547579.5502717-111-209548898235645/AnsiballZ_file.py'
Jan 27 20:59:39 compute-1 sudo[75644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:40 compute-1 python3.9[75646]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.lc4ix9l7 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:40 compute-1 sudo[75644]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:40 compute-1 sshd-session[74728]: Connection closed by 192.168.122.30 port 59258
Jan 27 20:59:40 compute-1 sshd-session[74725]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:59:40 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Jan 27 20:59:40 compute-1 systemd[1]: session-17.scope: Consumed 3.744s CPU time.
Jan 27 20:59:40 compute-1 systemd-logind[786]: Session 17 logged out. Waiting for processes to exit.
Jan 27 20:59:40 compute-1 systemd-logind[786]: Removed session 17.
Jan 27 20:59:46 compute-1 sshd-session[75671]: Accepted publickey for zuul from 192.168.122.30 port 45308 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:59:46 compute-1 systemd-logind[786]: New session 18 of user zuul.
Jan 27 20:59:46 compute-1 systemd[1]: Started Session 18 of User zuul.
Jan 27 20:59:46 compute-1 sshd-session[75671]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 20:59:47 compute-1 python3.9[75824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 20:59:48 compute-1 sudo[75978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzkbrvcrnhhojbczzxqagkhppphwwsmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547588.2464786-40-169484018414376/AnsiballZ_systemd.py'
Jan 27 20:59:48 compute-1 sudo[75978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:49 compute-1 python3.9[75980]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 27 20:59:49 compute-1 sudo[75978]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:49 compute-1 sudo[76132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnypuoofnmztkejtedshdwvmvuisxgia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547589.3652208-56-148563218921452/AnsiballZ_systemd.py'
Jan 27 20:59:49 compute-1 sudo[76132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:49 compute-1 python3.9[76134]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 20:59:49 compute-1 sudo[76132]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:50 compute-1 sudo[76285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjznxccpsbndqruubovuilcclydxpjxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547590.4013252-74-52346674597634/AnsiballZ_command.py'
Jan 27 20:59:50 compute-1 sudo[76285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:50 compute-1 python3.9[76287]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:59:51 compute-1 sudo[76285]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:51 compute-1 sudo[76438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmmmxisdchuejedzezxowxkusnjhgfja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547591.3270934-90-138659854136074/AnsiballZ_stat.py'
Jan 27 20:59:51 compute-1 sudo[76438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:51 compute-1 python3.9[76440]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 20:59:52 compute-1 sudo[76438]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:52 compute-1 sudo[76592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aldqrmusjpbjkphhodkyzepyfhvwrizc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547592.175511-106-59236128380514/AnsiballZ_command.py'
Jan 27 20:59:52 compute-1 sudo[76592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:52 compute-1 python3.9[76594]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 20:59:52 compute-1 sudo[76592]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:53 compute-1 sudo[76747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-libayhxilzfzbgvexijbtetvrsiuoule ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547592.981476-123-118274418362357/AnsiballZ_file.py'
Jan 27 20:59:53 compute-1 sudo[76747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 20:59:53 compute-1 python3.9[76749]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 20:59:53 compute-1 sudo[76747]: pam_unix(sudo:session): session closed for user root
Jan 27 20:59:53 compute-1 sshd-session[75674]: Connection closed by 192.168.122.30 port 45308
Jan 27 20:59:53 compute-1 sshd-session[75671]: pam_unix(sshd:session): session closed for user zuul
Jan 27 20:59:54 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Jan 27 20:59:54 compute-1 systemd[1]: session-18.scope: Consumed 4.685s CPU time.
Jan 27 20:59:54 compute-1 systemd-logind[786]: Session 18 logged out. Waiting for processes to exit.
Jan 27 20:59:54 compute-1 systemd-logind[786]: Removed session 18.
Jan 27 20:59:59 compute-1 sshd-session[76774]: Accepted publickey for zuul from 192.168.122.30 port 46088 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 20:59:59 compute-1 systemd-logind[786]: New session 19 of user zuul.
Jan 27 20:59:59 compute-1 systemd[1]: Started Session 19 of User zuul.
Jan 27 20:59:59 compute-1 sshd-session[76774]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 21:00:01 compute-1 python3.9[76927]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:00:02 compute-1 sudo[77081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmllkcbtrxzvywhuiwaiwqzhlsmkhlgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547601.7034314-44-177179142263541/AnsiballZ_setup.py'
Jan 27 21:00:02 compute-1 sudo[77081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:02 compute-1 python3.9[77083]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 21:00:02 compute-1 sudo[77081]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:02 compute-1 sudo[77165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpslxnlzjnbmddokppjdjcghixfusuvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547601.7034314-44-177179142263541/AnsiballZ_dnf.py'
Jan 27 21:00:02 compute-1 sudo[77165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:03 compute-1 python3.9[77167]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 27 21:00:04 compute-1 sudo[77165]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:05 compute-1 python3.9[77318]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:00:06 compute-1 python3.9[77469]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 27 21:00:07 compute-1 python3.9[77619]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:00:08 compute-1 python3.9[77769]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:00:08 compute-1 sshd-session[76777]: Connection closed by 192.168.122.30 port 46088
Jan 27 21:00:08 compute-1 sshd-session[76774]: pam_unix(sshd:session): session closed for user zuul
Jan 27 21:00:08 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Jan 27 21:00:08 compute-1 systemd[1]: session-19.scope: Consumed 6.342s CPU time.
Jan 27 21:00:08 compute-1 systemd-logind[786]: Session 19 logged out. Waiting for processes to exit.
Jan 27 21:00:08 compute-1 systemd-logind[786]: Removed session 19.
Jan 27 21:00:14 compute-1 sshd-session[77794]: Accepted publickey for zuul from 192.168.122.30 port 45806 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 21:00:14 compute-1 systemd-logind[786]: New session 20 of user zuul.
Jan 27 21:00:14 compute-1 systemd[1]: Started Session 20 of User zuul.
Jan 27 21:00:14 compute-1 sshd-session[77794]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 21:00:15 compute-1 python3.9[77947]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:00:17 compute-1 sudo[78101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyvyrfgocaonssqnnpotyqohyshhbwhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547616.5645194-76-91515756173599/AnsiballZ_file.py'
Jan 27 21:00:17 compute-1 sudo[78101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:17 compute-1 python3.9[78103]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:17 compute-1 sudo[78101]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:17 compute-1 sudo[78253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smfukctblmdgmxafllbbfdxqmqkqchcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547617.3956654-76-46524463268596/AnsiballZ_file.py'
Jan 27 21:00:17 compute-1 sudo[78253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:17 compute-1 python3.9[78255]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:17 compute-1 sudo[78253]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:18 compute-1 sudo[78405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pndfgukhtgrygjghxmzdqvcwhctazqsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547618.1610713-108-64778802770309/AnsiballZ_stat.py'
Jan 27 21:00:18 compute-1 sudo[78405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:18 compute-1 python3.9[78407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:18 compute-1 sudo[78405]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:19 compute-1 sudo[78528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjzrbalmcnzxgdtctggrypqplujcvlbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547618.1610713-108-64778802770309/AnsiballZ_copy.py'
Jan 27 21:00:19 compute-1 sudo[78528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:19 compute-1 python3.9[78530]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547618.1610713-108-64778802770309/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=3bfc7b4dee97a92d67380d6586c6a932291a4628 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:19 compute-1 sudo[78528]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:20 compute-1 sudo[78680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrvgloolyjepvfpmrqwgpmdhadgasdlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547619.7684615-108-194987496752406/AnsiballZ_stat.py'
Jan 27 21:00:20 compute-1 sudo[78680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:20 compute-1 python3.9[78682]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:20 compute-1 sudo[78680]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:20 compute-1 sudo[78803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwhoexvbrrjamnrovxbzktlgeiwdaxsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547619.7684615-108-194987496752406/AnsiballZ_copy.py'
Jan 27 21:00:20 compute-1 sudo[78803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:20 compute-1 python3.9[78805]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547619.7684615-108-194987496752406/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=813bd44c0b81b65a7ab8497396bba87ff0986baf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:20 compute-1 sudo[78803]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:21 compute-1 sudo[78955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlnbgohjppgcagadbjojymhtpmbbhffy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547620.8998864-108-45945645431973/AnsiballZ_stat.py'
Jan 27 21:00:21 compute-1 sudo[78955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:21 compute-1 python3.9[78957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:21 compute-1 sudo[78955]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:21 compute-1 sudo[79078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmohjfkrrptcnkwjrtylkcabrlrmuntu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547620.8998864-108-45945645431973/AnsiballZ_copy.py'
Jan 27 21:00:21 compute-1 sudo[79078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:22 compute-1 python3.9[79080]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547620.8998864-108-45945645431973/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=6ac7d90dfb9a49045ceec9161f9156675227c9d6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:22 compute-1 sudo[79078]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:22 compute-1 sudo[79230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-supcnsytwmtzrsavhukukbxuucnjgota ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547622.2199438-191-234991860052715/AnsiballZ_file.py'
Jan 27 21:00:22 compute-1 sudo[79230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:22 compute-1 python3.9[79232]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:22 compute-1 sudo[79230]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:23 compute-1 sudo[79382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fryyfaazaktirphivjatpkpvluwixjol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547622.8568463-191-86202166770963/AnsiballZ_file.py'
Jan 27 21:00:23 compute-1 sudo[79382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:23 compute-1 python3.9[79384]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:23 compute-1 sudo[79382]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:23 compute-1 sudo[79534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akhmozgweexsbwpzrctvzkmlizwvkvoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547623.5029845-222-237502532687083/AnsiballZ_stat.py'
Jan 27 21:00:23 compute-1 sudo[79534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:23 compute-1 python3.9[79536]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:23 compute-1 sudo[79534]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:24 compute-1 sudo[79657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwdognkssbubvjslfhjywdwtkchktvky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547623.5029845-222-237502532687083/AnsiballZ_copy.py'
Jan 27 21:00:24 compute-1 sudo[79657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:24 compute-1 python3.9[79659]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547623.5029845-222-237502532687083/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=e8ee150800b0ce884725b566f1026c1415378c4f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:24 compute-1 sudo[79657]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:25 compute-1 sudo[79809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuyimxefdioozzegbsivitwizmdlpdtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547624.8621764-222-196617356619708/AnsiballZ_stat.py'
Jan 27 21:00:25 compute-1 sudo[79809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:25 compute-1 python3.9[79811]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:25 compute-1 sudo[79809]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:25 compute-1 sudo[79932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdivhmszvosfdffshamngcikxlwfnxej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547624.8621764-222-196617356619708/AnsiballZ_copy.py'
Jan 27 21:00:25 compute-1 sudo[79932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:25 compute-1 python3.9[79934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547624.8621764-222-196617356619708/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=b598035c86178d2ef949902fce1149963d22177d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:25 compute-1 sudo[79932]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:26 compute-1 sudo[80084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcibmmfotfulwuvgvkaqfgjzbesknhhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547626.091038-222-55740476081119/AnsiballZ_stat.py'
Jan 27 21:00:26 compute-1 sudo[80084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:26 compute-1 python3.9[80086]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:26 compute-1 sudo[80084]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:26 compute-1 sudo[80207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewtvgddmkgipufzklnsqhvljfulijkya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547626.091038-222-55740476081119/AnsiballZ_copy.py'
Jan 27 21:00:26 compute-1 sudo[80207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:27 compute-1 python3.9[80209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547626.091038-222-55740476081119/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=e76f583249c61574aff9d62492f7f980bd906e5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:27 compute-1 sudo[80207]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:27 compute-1 sudo[80359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpfhardhoimkdxshjfpymmmqrdurfkck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547627.4528735-309-225671932108879/AnsiballZ_file.py'
Jan 27 21:00:27 compute-1 sudo[80359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:27 compute-1 python3.9[80361]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:28 compute-1 sudo[80359]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:28 compute-1 sudo[80511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exdplrwfkhvhkqsneoafrqmedwjcgvnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547628.1810064-309-78474102845757/AnsiballZ_file.py'
Jan 27 21:00:28 compute-1 sudo[80511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:28 compute-1 python3.9[80513]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:28 compute-1 sudo[80511]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:29 compute-1 sudo[80663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iidboegqtdgovvbkdpupqtzkqidijfsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547629.088484-346-272977455033243/AnsiballZ_stat.py'
Jan 27 21:00:29 compute-1 sudo[80663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:29 compute-1 python3.9[80665]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:29 compute-1 sudo[80663]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:30 compute-1 sudo[80786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elomybyoklvjhvqurflgaldqtchvruyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547629.088484-346-272977455033243/AnsiballZ_copy.py'
Jan 27 21:00:30 compute-1 sudo[80786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:30 compute-1 python3.9[80788]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547629.088484-346-272977455033243/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=e0aad3397554f4a4b2bb275c45d5f851e96dc74e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:30 compute-1 sudo[80786]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:31 compute-1 sudo[80938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wggetejacxqdraeyvdrabkmtzyrkjpcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547630.6615882-346-271201254591127/AnsiballZ_stat.py'
Jan 27 21:00:31 compute-1 sudo[80938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:31 compute-1 python3.9[80940]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:31 compute-1 sudo[80938]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:31 compute-1 sudo[81061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvyaxejuimxxzdestcroxcuogzdpipgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547630.6615882-346-271201254591127/AnsiballZ_copy.py'
Jan 27 21:00:31 compute-1 sudo[81061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:31 compute-1 python3.9[81063]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547630.6615882-346-271201254591127/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=6d8b575e3f2fff68a3ea35511c18a3dabc7f8095 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:31 compute-1 sudo[81061]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:32 compute-1 sudo[81213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxiupcoukjyxoamhdudazdofofiboqiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547632.037965-346-164319970959961/AnsiballZ_stat.py'
Jan 27 21:00:32 compute-1 sudo[81213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:32 compute-1 python3.9[81215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:32 compute-1 sudo[81213]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:32 compute-1 chronyd[65508]: Selected source 149.56.19.163 (pool.ntp.org)
Jan 27 21:00:32 compute-1 sudo[81336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhdqsqclzgjtycwnbtrmpjfnikgehakz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547632.037965-346-164319970959961/AnsiballZ_copy.py'
Jan 27 21:00:33 compute-1 sudo[81336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:33 compute-1 python3.9[81338]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547632.037965-346-164319970959961/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=9493aa59131cda7f389ab1ea580c5d4d273e76b5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:33 compute-1 sudo[81336]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:33 compute-1 sudo[81488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moahwjbwjuvdodukpbmrugthsukpikwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547633.495944-436-32259207713427/AnsiballZ_file.py'
Jan 27 21:00:33 compute-1 sudo[81488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:34 compute-1 python3.9[81490]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:34 compute-1 sudo[81488]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:34 compute-1 sudo[81640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foudbzgbwumfkyxgbrsocoxybclkpmvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547634.149194-436-23198591404393/AnsiballZ_file.py'
Jan 27 21:00:34 compute-1 sudo[81640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:34 compute-1 python3.9[81642]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:34 compute-1 sudo[81640]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:35 compute-1 sudo[81792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxkboxrnktkfopbuevvuygwwmqkonoey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547635.0091488-466-35985078519904/AnsiballZ_stat.py'
Jan 27 21:00:35 compute-1 sudo[81792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:35 compute-1 python3.9[81794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:35 compute-1 sudo[81792]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:35 compute-1 sudo[81915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odpcbzgjtytvdwivvuvwcqigqqqaqfnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547635.0091488-466-35985078519904/AnsiballZ_copy.py'
Jan 27 21:00:35 compute-1 sudo[81915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:36 compute-1 python3.9[81917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547635.0091488-466-35985078519904/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=38d0d35ebd231408e4ad47b3890d343a0398ff42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:36 compute-1 sudo[81915]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:36 compute-1 sudo[82067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snkjprgsdleuyolybkwqnnykouuerdun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547636.3328345-466-21642986314977/AnsiballZ_stat.py'
Jan 27 21:00:36 compute-1 sudo[82067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:36 compute-1 python3.9[82069]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:36 compute-1 sudo[82067]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:37 compute-1 sudo[82190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrgnenxvqmhfrujykbmqjctdxepssqbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547636.3328345-466-21642986314977/AnsiballZ_copy.py'
Jan 27 21:00:37 compute-1 sudo[82190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:37 compute-1 python3.9[82192]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547636.3328345-466-21642986314977/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=6d8b575e3f2fff68a3ea35511c18a3dabc7f8095 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:37 compute-1 sudo[82190]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:38 compute-1 sudo[82342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycecuzuelceqixcyvvepaaeazycfiovx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547637.6279602-466-270771133423084/AnsiballZ_stat.py'
Jan 27 21:00:38 compute-1 sudo[82342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:38 compute-1 python3.9[82344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:38 compute-1 sudo[82342]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:38 compute-1 sudo[82465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fntlhmwpudizvwesbkjymazeqvctrmnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547637.6279602-466-270771133423084/AnsiballZ_copy.py'
Jan 27 21:00:38 compute-1 sudo[82465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:38 compute-1 python3.9[82467]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547637.6279602-466-270771133423084/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=b0d1b8185e16ec9b0ced7c440e523e8bb215cd85 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:38 compute-1 sudo[82465]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:40 compute-1 sudo[82617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csvlpsijsaslngtfsqhrbnhhxgonhxmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547639.7313366-589-167392784785017/AnsiballZ_file.py'
Jan 27 21:00:40 compute-1 sudo[82617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:40 compute-1 python3.9[82619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:40 compute-1 sudo[82617]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:40 compute-1 sudo[82769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyjzgnjfqyciutvmpdntlqlqjeucpfxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547640.463019-605-266855397273733/AnsiballZ_stat.py'
Jan 27 21:00:40 compute-1 sudo[82769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:40 compute-1 python3.9[82771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:40 compute-1 sudo[82769]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:41 compute-1 sudo[82892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okwabahilwurdmrndiyvfjpoipkoczue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547640.463019-605-266855397273733/AnsiballZ_copy.py'
Jan 27 21:00:41 compute-1 sudo[82892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:41 compute-1 python3.9[82894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547640.463019-605-266855397273733/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3b607b21e610ea4b3099e30a5eee3e7772893bf1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:41 compute-1 sudo[82892]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:42 compute-1 sudo[83044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgsilkmgfpfeutvsscxwxkuldjemjrbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547641.8628566-638-215895007545881/AnsiballZ_file.py'
Jan 27 21:00:42 compute-1 sudo[83044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:42 compute-1 python3.9[83046]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:42 compute-1 sudo[83044]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:42 compute-1 sudo[83196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfznaxzsouahuvcdmiduelddoaxwzaih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547642.6644933-657-125629475311526/AnsiballZ_stat.py'
Jan 27 21:00:42 compute-1 sudo[83196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:43 compute-1 python3.9[83198]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:43 compute-1 sudo[83196]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:43 compute-1 sudo[83319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emusjzvkhuxacxgzcdftsqcbdrwbiltz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547642.6644933-657-125629475311526/AnsiballZ_copy.py'
Jan 27 21:00:43 compute-1 sudo[83319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:43 compute-1 python3.9[83321]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547642.6644933-657-125629475311526/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3b607b21e610ea4b3099e30a5eee3e7772893bf1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:43 compute-1 sudo[83319]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:44 compute-1 sudo[83471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmgrzrzdqihtqgvjculcmfyblvtstfql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547643.9840028-688-116383635051780/AnsiballZ_file.py'
Jan 27 21:00:44 compute-1 sudo[83471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:44 compute-1 python3.9[83473]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:44 compute-1 sudo[83471]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:44 compute-1 sudo[83623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjkxzzugsydnuyyhuqmlyzartapxlqab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547644.6386359-705-203029663121754/AnsiballZ_stat.py'
Jan 27 21:00:44 compute-1 sudo[83623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:45 compute-1 python3.9[83625]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:45 compute-1 sudo[83623]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:45 compute-1 sudo[83746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoujfjznegapbmfrtejlhhxkbpynidll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547644.6386359-705-203029663121754/AnsiballZ_copy.py'
Jan 27 21:00:45 compute-1 sudo[83746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:45 compute-1 python3.9[83748]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547644.6386359-705-203029663121754/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3b607b21e610ea4b3099e30a5eee3e7772893bf1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:45 compute-1 sudo[83746]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:46 compute-1 sudo[83898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjporouvolpfxocdbyoxdgkzvgarszzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547645.9861603-738-249223884124232/AnsiballZ_file.py'
Jan 27 21:00:46 compute-1 sudo[83898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:46 compute-1 python3.9[83900]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:46 compute-1 sudo[83898]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:47 compute-1 sudo[84050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phlddjailwbefqxqmkywwyuqwxzenemq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547646.6842082-753-218784823573456/AnsiballZ_stat.py'
Jan 27 21:00:47 compute-1 sudo[84050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:47 compute-1 python3.9[84052]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:47 compute-1 sudo[84050]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:47 compute-1 sudo[84173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvlizfjeitngxpumyyvgtdbuqzvuaxdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547646.6842082-753-218784823573456/AnsiballZ_copy.py'
Jan 27 21:00:47 compute-1 sudo[84173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:47 compute-1 python3.9[84175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547646.6842082-753-218784823573456/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3b607b21e610ea4b3099e30a5eee3e7772893bf1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:47 compute-1 sudo[84173]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:48 compute-1 sudo[84325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uupdwnbcmxinpvkyqmxadyifuhmsdffv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547648.109383-786-2009368943909/AnsiballZ_file.py'
Jan 27 21:00:48 compute-1 sudo[84325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:48 compute-1 python3.9[84327]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:48 compute-1 sudo[84325]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:49 compute-1 sudo[84477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhlqljhmyzhwrkedapfzmrorazfqdsuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547648.8344576-802-136719990700931/AnsiballZ_stat.py'
Jan 27 21:00:49 compute-1 sudo[84477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:49 compute-1 python3.9[84479]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:49 compute-1 sudo[84477]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:49 compute-1 sudo[84600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uslwamzmnwzbiidmxbwmdxwettznqfmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547648.8344576-802-136719990700931/AnsiballZ_copy.py'
Jan 27 21:00:49 compute-1 sudo[84600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:49 compute-1 python3.9[84602]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547648.8344576-802-136719990700931/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3b607b21e610ea4b3099e30a5eee3e7772893bf1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:49 compute-1 sudo[84600]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:50 compute-1 sudo[84752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnyfuysmlitqajisupxsjcgrstxpznwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547650.0720463-834-162322242646066/AnsiballZ_file.py'
Jan 27 21:00:50 compute-1 sudo[84752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:50 compute-1 python3.9[84754]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:50 compute-1 sudo[84752]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:51 compute-1 sudo[84904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdczdtugajrqwbogxfkjncwazszsccve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547650.819787-849-214529654798014/AnsiballZ_stat.py'
Jan 27 21:00:51 compute-1 sudo[84904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:51 compute-1 python3.9[84906]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:51 compute-1 sudo[84904]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:51 compute-1 sudo[85027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efhdnqcqjnqhpphhzbvmrctzuxemcgfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547650.819787-849-214529654798014/AnsiballZ_copy.py'
Jan 27 21:00:51 compute-1 sudo[85027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:51 compute-1 python3.9[85029]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547650.819787-849-214529654798014/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3b607b21e610ea4b3099e30a5eee3e7772893bf1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:51 compute-1 sudo[85027]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:52 compute-1 sudo[85179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgouqljizvdgoffogrvqscldmhqownxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547652.1781354-884-178531665461572/AnsiballZ_file.py'
Jan 27 21:00:52 compute-1 sudo[85179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:52 compute-1 python3.9[85181]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:00:52 compute-1 sudo[85179]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:53 compute-1 sudo[85331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugobdtxkxmtahypsgrjjmilxnobfrixl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547652.9049182-898-576528558835/AnsiballZ_stat.py'
Jan 27 21:00:53 compute-1 sudo[85331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:53 compute-1 python3.9[85333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:00:53 compute-1 sudo[85331]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:53 compute-1 sudo[85454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogtylmwdeuttmpbysnkpbduylyqnhmrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547652.9049182-898-576528558835/AnsiballZ_copy.py'
Jan 27 21:00:53 compute-1 sudo[85454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:00:54 compute-1 python3.9[85456]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547652.9049182-898-576528558835/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3b607b21e610ea4b3099e30a5eee3e7772893bf1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:00:54 compute-1 sudo[85454]: pam_unix(sudo:session): session closed for user root
Jan 27 21:00:54 compute-1 sshd-session[77797]: Connection closed by 192.168.122.30 port 45806
Jan 27 21:00:54 compute-1 sshd-session[77794]: pam_unix(sshd:session): session closed for user zuul
Jan 27 21:00:54 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Jan 27 21:00:54 compute-1 systemd[1]: session-20.scope: Consumed 31.290s CPU time.
Jan 27 21:00:54 compute-1 systemd-logind[786]: Session 20 logged out. Waiting for processes to exit.
Jan 27 21:00:54 compute-1 systemd-logind[786]: Removed session 20.
Jan 27 21:00:59 compute-1 sshd-session[85481]: Accepted publickey for zuul from 192.168.122.30 port 47052 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 21:00:59 compute-1 systemd-logind[786]: New session 21 of user zuul.
Jan 27 21:00:59 compute-1 systemd[1]: Started Session 21 of User zuul.
Jan 27 21:00:59 compute-1 sshd-session[85481]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 21:01:00 compute-1 python3.9[85634]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:01:01 compute-1 sudo[85788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofmnhbgzrpdaryilleftyntfisjvmvae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547661.1566188-44-26499075263630/AnsiballZ_file.py'
Jan 27 21:01:01 compute-1 sudo[85788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:01 compute-1 CROND[85792]: (root) CMD (run-parts /etc/cron.hourly)
Jan 27 21:01:01 compute-1 run-parts[85795]: (/etc/cron.hourly) starting 0anacron
Jan 27 21:01:01 compute-1 python3.9[85790]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:01:01 compute-1 run-parts[85801]: (/etc/cron.hourly) finished 0anacron
Jan 27 21:01:01 compute-1 CROND[85791]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 27 21:01:01 compute-1 sudo[85788]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:02 compute-1 sudo[85951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubknkrjcpzlqnghvctkxgtspcnjwkfkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547661.9845166-44-74130192051320/AnsiballZ_file.py'
Jan 27 21:01:02 compute-1 sudo[85951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:02 compute-1 python3.9[85953]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:01:02 compute-1 sudo[85951]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:03 compute-1 python3.9[86103]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:01:03 compute-1 sudo[86253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnavotwjkeivummlxdogxrorugauonex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547663.5244777-90-172228276693060/AnsiballZ_seboolean.py'
Jan 27 21:01:03 compute-1 sudo[86253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:04 compute-1 python3.9[86255]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 27 21:01:05 compute-1 sudo[86253]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:05 compute-1 sudo[86409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdxzdhkpbbtpmjdxiuhfrguewaqguiuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547665.612715-110-257834468853487/AnsiballZ_setup.py'
Jan 27 21:01:05 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 27 21:01:05 compute-1 sudo[86409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:06 compute-1 python3.9[86411]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 21:01:06 compute-1 sudo[86409]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:07 compute-1 sudo[86493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmgcmynnkoepvgrqipgqnjxwfsudpfvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547665.612715-110-257834468853487/AnsiballZ_dnf.py'
Jan 27 21:01:07 compute-1 sudo[86493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:07 compute-1 python3.9[86495]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 21:01:08 compute-1 sudo[86493]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:09 compute-1 sudo[86646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzxagpjvqfcmklscadxbdishjhmafvav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547668.7556756-134-237417887019726/AnsiballZ_systemd.py'
Jan 27 21:01:09 compute-1 sudo[86646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:09 compute-1 python3.9[86648]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 27 21:01:09 compute-1 sudo[86646]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:10 compute-1 sudo[86801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzqatdskcoreobtswmhxlfjyqzihvfai ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769547669.9487627-150-35228078665908/AnsiballZ_edpm_nftables_snippet.py'
Jan 27 21:01:10 compute-1 sudo[86801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:10 compute-1 python3[86803]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 27 21:01:10 compute-1 sudo[86801]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:11 compute-1 sudo[86953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npdlxabpkscmpvrbseylxlyerrxfdzhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547670.9402952-168-189371758560898/AnsiballZ_file.py'
Jan 27 21:01:11 compute-1 sudo[86953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:11 compute-1 python3.9[86955]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:11 compute-1 sudo[86953]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:12 compute-1 sudo[87105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifuurvqdqvnggigqamgnhzemohmapzvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547671.6515112-184-6695508021044/AnsiballZ_stat.py'
Jan 27 21:01:12 compute-1 sudo[87105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:12 compute-1 python3.9[87107]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:12 compute-1 sudo[87105]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:12 compute-1 sudo[87183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppjanxzvfcoqxarkomvkljbqkxwvexhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547671.6515112-184-6695508021044/AnsiballZ_file.py'
Jan 27 21:01:12 compute-1 sudo[87183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:12 compute-1 python3.9[87185]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:12 compute-1 sudo[87183]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:13 compute-1 sudo[87335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clzfcexxpwioprbnibxusrkzjaprjuit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547672.9974794-208-134282366819466/AnsiballZ_stat.py'
Jan 27 21:01:13 compute-1 sudo[87335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:13 compute-1 python3.9[87337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:13 compute-1 sudo[87335]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:13 compute-1 sudo[87413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhnbzfhptteczhwqcflwdaghdjwfrpqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547672.9974794-208-134282366819466/AnsiballZ_file.py'
Jan 27 21:01:13 compute-1 sudo[87413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:13 compute-1 python3.9[87415]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.r3hb71eq recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:13 compute-1 sudo[87413]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:14 compute-1 sudo[87565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikxgtwfpxkodpfeouuzwhylxrquvrlqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547674.1741688-232-158381092194569/AnsiballZ_stat.py'
Jan 27 21:01:14 compute-1 sudo[87565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:14 compute-1 python3.9[87567]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:14 compute-1 sudo[87565]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:14 compute-1 sudo[87643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdwodslrlmvjkzzmueezzuxquxviyhoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547674.1741688-232-158381092194569/AnsiballZ_file.py'
Jan 27 21:01:14 compute-1 sudo[87643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:15 compute-1 python3.9[87645]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:15 compute-1 sudo[87643]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:15 compute-1 sudo[87795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruexqzdfmgpmnqpinguusjnveknkdmbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547675.3384268-258-218127942876853/AnsiballZ_command.py'
Jan 27 21:01:15 compute-1 sudo[87795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:16 compute-1 python3.9[87797]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:01:16 compute-1 sudo[87795]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:16 compute-1 sudo[87948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mquhtcwkafmlmmcpubjbqfvqbrvvwztt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769547676.4073272-274-26377141189606/AnsiballZ_edpm_nftables_from_files.py'
Jan 27 21:01:16 compute-1 sudo[87948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:17 compute-1 python3[87950]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 27 21:01:17 compute-1 sudo[87948]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:17 compute-1 sudo[88100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oljxyiompnpcqazpbzpxamyhzavfgvcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547677.352042-290-8289605591741/AnsiballZ_stat.py'
Jan 27 21:01:17 compute-1 sudo[88100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:17 compute-1 python3.9[88102]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:17 compute-1 sudo[88100]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:18 compute-1 sudo[88225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxqgaqxzdifjnswwhpyppzfbqyrnopbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547677.352042-290-8289605591741/AnsiballZ_copy.py'
Jan 27 21:01:18 compute-1 sudo[88225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:18 compute-1 python3.9[88227]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547677.352042-290-8289605591741/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:18 compute-1 sudo[88225]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:19 compute-1 sudo[88377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfipsxspsfdrxddqavbxvopjlgmsstfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547678.7587457-320-257092442960398/AnsiballZ_stat.py'
Jan 27 21:01:19 compute-1 sudo[88377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:19 compute-1 python3.9[88379]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:19 compute-1 sudo[88377]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:19 compute-1 sudo[88502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfucaeelnfrwgdsuvpyeafsrfkhsskyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547678.7587457-320-257092442960398/AnsiballZ_copy.py'
Jan 27 21:01:19 compute-1 sudo[88502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:19 compute-1 python3.9[88504]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547678.7587457-320-257092442960398/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:19 compute-1 sudo[88502]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:20 compute-1 sudo[88654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzveqnrxcvjenykxgyncgrpbcptzyvir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547680.1708653-350-213541648385907/AnsiballZ_stat.py'
Jan 27 21:01:20 compute-1 sudo[88654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:20 compute-1 python3.9[88656]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:20 compute-1 sudo[88654]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:21 compute-1 sudo[88779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tefrafiobkxaxirgfdficefqhvcuwsao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547680.1708653-350-213541648385907/AnsiballZ_copy.py'
Jan 27 21:01:21 compute-1 sudo[88779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:21 compute-1 python3.9[88781]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547680.1708653-350-213541648385907/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:21 compute-1 sudo[88779]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:21 compute-1 sudo[88931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhxjtpbvdkiwrywoakfuooreevarnhjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547681.5432694-380-241832717134367/AnsiballZ_stat.py'
Jan 27 21:01:21 compute-1 sudo[88931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:21 compute-1 python3.9[88933]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:22 compute-1 sudo[88931]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:22 compute-1 sudo[89056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cchnamjkughmdsjxmlqqawtxueayttio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547681.5432694-380-241832717134367/AnsiballZ_copy.py'
Jan 27 21:01:22 compute-1 sudo[89056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:22 compute-1 python3.9[89058]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547681.5432694-380-241832717134367/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:22 compute-1 sudo[89056]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:23 compute-1 sudo[89208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eswsscfrgthfovgsmjhgguywjeyyhxqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547683.0841408-410-77573698044319/AnsiballZ_stat.py'
Jan 27 21:01:23 compute-1 sudo[89208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:23 compute-1 python3.9[89210]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:23 compute-1 sudo[89208]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:24 compute-1 sudo[89333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvurwzsifmfxhfedihoecygqmqzmztpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547683.0841408-410-77573698044319/AnsiballZ_copy.py'
Jan 27 21:01:24 compute-1 sudo[89333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:24 compute-1 python3.9[89335]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769547683.0841408-410-77573698044319/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:24 compute-1 sudo[89333]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:24 compute-1 sudo[89485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnuqtfrhvfgchdqciwfbkqfidshroudy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547684.3987684-440-145259378166080/AnsiballZ_file.py'
Jan 27 21:01:24 compute-1 sudo[89485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:24 compute-1 python3.9[89487]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:24 compute-1 sudo[89485]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:25 compute-1 sudo[89637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngvzojbybsvpgwjpcthunfgspyvxyudm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547685.218429-456-165147642516845/AnsiballZ_command.py'
Jan 27 21:01:25 compute-1 sudo[89637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:25 compute-1 python3.9[89639]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:01:25 compute-1 sudo[89637]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:26 compute-1 sudo[89792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sygxvdrkkltriofesiekrnlyxgsfyava ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547685.923453-472-173492817560079/AnsiballZ_blockinfile.py'
Jan 27 21:01:26 compute-1 sudo[89792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:26 compute-1 python3.9[89794]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:26 compute-1 sudo[89792]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:27 compute-1 sudo[89944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfhsgwnfqptyzsxoknwlevederarunqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547686.9604452-490-270455374069999/AnsiballZ_command.py'
Jan 27 21:01:27 compute-1 sudo[89944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:27 compute-1 python3.9[89946]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:01:27 compute-1 sudo[89944]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:28 compute-1 sudo[90097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykbgbooxwnhaluogxjsiutbukffhpvxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547687.719606-506-60156661475745/AnsiballZ_stat.py'
Jan 27 21:01:28 compute-1 sudo[90097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:28 compute-1 python3.9[90099]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:01:28 compute-1 sudo[90097]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:28 compute-1 sudo[90251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuxyngqjetapsaxppvghxkrknlzsyxdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547688.4278553-522-19072588692821/AnsiballZ_command.py'
Jan 27 21:01:28 compute-1 sudo[90251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:28 compute-1 python3.9[90253]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:01:28 compute-1 sudo[90251]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:29 compute-1 sudo[90406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgotoscoaumiigrqavbqvzbqttxsbsbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547689.1294274-538-7619667160683/AnsiballZ_file.py'
Jan 27 21:01:29 compute-1 sudo[90406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:30 compute-1 python3.9[90408]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:30 compute-1 sudo[90406]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:31 compute-1 python3.9[90558]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:01:32 compute-1 sudo[90709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzypdwdxrjtwmzwfoohojvixfeupejer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547692.0495677-618-137469642390286/AnsiballZ_command.py'
Jan 27 21:01:32 compute-1 sudo[90709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:32 compute-1 python3.9[90711]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:01:32 compute-1 ovs-vsctl[90712]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 27 21:01:32 compute-1 sudo[90709]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:33 compute-1 sudo[90862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oryvbbbttlsclcvjxqbwugbhkldelzxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547692.9525084-636-87352061652921/AnsiballZ_command.py'
Jan 27 21:01:33 compute-1 sudo[90862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:33 compute-1 python3.9[90864]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:01:33 compute-1 sudo[90862]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:34 compute-1 sudo[91017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voxnnoxgixnsyfsthymkfwmxszogwrhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547693.7538798-652-210060049890281/AnsiballZ_command.py'
Jan 27 21:01:34 compute-1 sudo[91017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:34 compute-1 python3.9[91019]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:01:34 compute-1 ovs-vsctl[91020]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 27 21:01:34 compute-1 sudo[91017]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:34 compute-1 python3.9[91170]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:01:35 compute-1 sudo[91322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpmdlfeqplxswknhbhdmwztbffbiqrhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547695.3587363-686-131397662184375/AnsiballZ_file.py'
Jan 27 21:01:35 compute-1 sudo[91322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:35 compute-1 python3.9[91324]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:01:35 compute-1 sudo[91322]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:36 compute-1 sudo[91474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiucvptzixpwzhpygdcypolgfgsowyrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547696.2130735-702-264368845656950/AnsiballZ_stat.py'
Jan 27 21:01:36 compute-1 sudo[91474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:36 compute-1 python3.9[91476]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:36 compute-1 sudo[91474]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:37 compute-1 sudo[91552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyurwrpzkwftxnalvrqiabsfvpmeojbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547696.2130735-702-264368845656950/AnsiballZ_file.py'
Jan 27 21:01:37 compute-1 sudo[91552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:37 compute-1 python3.9[91554]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:01:37 compute-1 sudo[91552]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:37 compute-1 sudo[91704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpbcywzuvwmefudjnsasuylxizkbawvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547697.3791487-702-280851877624583/AnsiballZ_stat.py'
Jan 27 21:01:37 compute-1 sudo[91704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:37 compute-1 python3.9[91706]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:37 compute-1 sudo[91704]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:38 compute-1 sudo[91782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlxawioeornoutzugxaoqhxiskhqjswo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547697.3791487-702-280851877624583/AnsiballZ_file.py'
Jan 27 21:01:38 compute-1 sudo[91782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:38 compute-1 python3.9[91784]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:01:38 compute-1 sudo[91782]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:39 compute-1 sudo[91934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfztdtcpepoqwyprnxqyveniplxsdcyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547698.6374245-748-265403107236359/AnsiballZ_file.py'
Jan 27 21:01:39 compute-1 sudo[91934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:39 compute-1 python3.9[91936]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:39 compute-1 sudo[91934]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:39 compute-1 sudo[92086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsflaudjohimxzeytjikurqqvnwtlyoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547699.5522096-764-954922500923/AnsiballZ_stat.py'
Jan 27 21:01:39 compute-1 sudo[92086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:40 compute-1 python3.9[92088]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:40 compute-1 sudo[92086]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:40 compute-1 sudo[92164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nczijdjertwriqnwmuonjqukbwhjvbjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547699.5522096-764-954922500923/AnsiballZ_file.py'
Jan 27 21:01:40 compute-1 sudo[92164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:40 compute-1 python3.9[92166]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:40 compute-1 sudo[92164]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:41 compute-1 sudo[92316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-habjawxwudiynfovomtbogakpudtmpkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547700.7541137-788-80953595826097/AnsiballZ_stat.py'
Jan 27 21:01:41 compute-1 sudo[92316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:41 compute-1 python3.9[92318]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:41 compute-1 sudo[92316]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:41 compute-1 sudo[92394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dymzohtpnjtvcvlamvvxyucwbtwqyxep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547700.7541137-788-80953595826097/AnsiballZ_file.py'
Jan 27 21:01:41 compute-1 sudo[92394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:41 compute-1 python3.9[92396]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:41 compute-1 sudo[92394]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:42 compute-1 sudo[92546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsdscxopctvtjztkbdldelydllgajefp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547702.0205703-812-140710519266263/AnsiballZ_systemd.py'
Jan 27 21:01:42 compute-1 sudo[92546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:42 compute-1 python3.9[92548]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:01:42 compute-1 systemd[1]: Reloading.
Jan 27 21:01:42 compute-1 systemd-sysv-generator[92578]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:01:42 compute-1 systemd-rc-local-generator[92574]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:01:42 compute-1 sudo[92546]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:43 compute-1 sudo[92735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndzwgjrblivcqbvgsmwrvotiklwwacqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547703.2973971-828-125094266311974/AnsiballZ_stat.py'
Jan 27 21:01:43 compute-1 sudo[92735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:43 compute-1 python3.9[92737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:43 compute-1 sudo[92735]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:44 compute-1 sudo[92813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppdqihllbpoeoybdspnhnbiebbuwaqce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547703.2973971-828-125094266311974/AnsiballZ_file.py'
Jan 27 21:01:44 compute-1 sudo[92813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:44 compute-1 python3.9[92815]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:44 compute-1 sudo[92813]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:44 compute-1 sudo[92965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xulyhkotbsfxxbkdssbhavijylkvkyvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547704.6949105-852-27047022786797/AnsiballZ_stat.py'
Jan 27 21:01:44 compute-1 sudo[92965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:45 compute-1 python3.9[92967]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:45 compute-1 sudo[92965]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:45 compute-1 sudo[93043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quimkllquskojfjaufwgaxfokrfoddoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547704.6949105-852-27047022786797/AnsiballZ_file.py'
Jan 27 21:01:45 compute-1 sudo[93043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:45 compute-1 python3.9[93045]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:45 compute-1 sudo[93043]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:46 compute-1 sudo[93195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goohydezzzwmaxzfdxnffkqbwlnszowo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547705.922121-876-94960051372726/AnsiballZ_systemd.py'
Jan 27 21:01:46 compute-1 sudo[93195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:46 compute-1 python3.9[93197]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:01:46 compute-1 systemd[1]: Reloading.
Jan 27 21:01:46 compute-1 systemd-rc-local-generator[93224]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:01:46 compute-1 systemd-sysv-generator[93227]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:01:46 compute-1 systemd[1]: Starting Create netns directory...
Jan 27 21:01:46 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 27 21:01:46 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 27 21:01:46 compute-1 systemd[1]: Finished Create netns directory.
Jan 27 21:01:46 compute-1 sudo[93195]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:47 compute-1 sudo[93387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnhmwejmsjeslpwkmkjdlqsggzyaccjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547707.2856903-896-152764741566797/AnsiballZ_file.py'
Jan 27 21:01:47 compute-1 sudo[93387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:47 compute-1 python3.9[93389]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:01:47 compute-1 sudo[93387]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:48 compute-1 sudo[93539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfwhgtvyzwgksapnrlqzacbpponmbgoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547708.3102462-912-32306246560001/AnsiballZ_stat.py'
Jan 27 21:01:48 compute-1 sudo[93539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:48 compute-1 python3.9[93541]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:48 compute-1 sudo[93539]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:49 compute-1 sudo[93662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxlrinsrrszvogwlndhkfrvhsljsgrye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547708.3102462-912-32306246560001/AnsiballZ_copy.py'
Jan 27 21:01:49 compute-1 sudo[93662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:49 compute-1 python3.9[93664]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547708.3102462-912-32306246560001/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:01:49 compute-1 sudo[93662]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:50 compute-1 sudo[93814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpnulenkrjkrnplkzshnyvyjjbrwikdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547710.0290687-946-188760452368822/AnsiballZ_file.py'
Jan 27 21:01:50 compute-1 sudo[93814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:50 compute-1 python3.9[93816]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:50 compute-1 sudo[93814]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:51 compute-1 sudo[93966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjlzllkqwlupwbbphgczzqgwumansixw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547710.786586-962-96663994458161/AnsiballZ_file.py'
Jan 27 21:01:51 compute-1 sudo[93966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:51 compute-1 python3.9[93968]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:01:51 compute-1 sudo[93966]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:51 compute-1 sudo[94118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntmfwsnyztddznsmdnqcjmpfjycbrtgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547711.6036296-978-257627868753604/AnsiballZ_stat.py'
Jan 27 21:01:51 compute-1 sudo[94118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:52 compute-1 python3.9[94120]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:01:52 compute-1 sudo[94118]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:52 compute-1 sudo[94241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glkufoymrzjgnmgtnqvdbmzaarqywoon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547711.6036296-978-257627868753604/AnsiballZ_copy.py'
Jan 27 21:01:52 compute-1 sudo[94241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:52 compute-1 python3.9[94243]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547711.6036296-978-257627868753604/.source.json _original_basename=.kbpey4ao follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:52 compute-1 sudo[94241]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:53 compute-1 python3.9[94393]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:01:55 compute-1 sudo[94814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnzkgwdaiihzlmsrfomumsztrtllqsmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547715.3284316-1058-403460789277/AnsiballZ_container_config_data.py'
Jan 27 21:01:55 compute-1 sudo[94814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:56 compute-1 python3.9[94816]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 27 21:01:56 compute-1 sudo[94814]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:56 compute-1 sudo[94966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwoaaigiftllswnbddpmdgzobiioblzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547716.4258523-1080-204829529690895/AnsiballZ_container_config_hash.py'
Jan 27 21:01:56 compute-1 sudo[94966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:57 compute-1 python3.9[94968]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 27 21:01:57 compute-1 sudo[94966]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:57 compute-1 sudo[95118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaajkboaxldyaryczrnrdvclvdmgnnap ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769547717.4034233-1100-81513059884915/AnsiballZ_edpm_container_manage.py'
Jan 27 21:01:57 compute-1 sudo[95118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:58 compute-1 python3[95120]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 27 21:01:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 21:01:58 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 21:01:58 compute-1 podman[95156]: 2026-01-27 21:01:58.362693779 +0000 UTC m=+0.047909139 container create 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260126, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Jan 27 21:01:58 compute-1 podman[95156]: 2026-01-27 21:01:58.340047904 +0000 UTC m=+0.025263274 image pull 8cb1c5bd5110b926616067efa18d7f44906d7840bd53541459116085a1a2a2ac 38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Jan 27 21:01:58 compute-1 python3[95120]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Jan 27 21:01:58 compute-1 sudo[95118]: pam_unix(sudo:session): session closed for user root
Jan 27 21:01:59 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 27 21:01:59 compute-1 sudo[95342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egwwltucgvbetleykyndodhgpfwzahop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547719.1159945-1116-10212504279573/AnsiballZ_stat.py'
Jan 27 21:01:59 compute-1 sudo[95342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:01:59 compute-1 python3.9[95344]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:01:59 compute-1 sudo[95342]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:00 compute-1 sudo[95496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dohnrhgwqondcyimmliudapcnfsxyacq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547720.0187974-1134-162240644753439/AnsiballZ_file.py'
Jan 27 21:02:00 compute-1 sudo[95496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:00 compute-1 python3.9[95498]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:02:00 compute-1 sudo[95496]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:00 compute-1 sudo[95572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbfoetdovszuyqqczzfxfmnmsbknjyfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547720.0187974-1134-162240644753439/AnsiballZ_stat.py'
Jan 27 21:02:00 compute-1 sudo[95572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:00 compute-1 python3.9[95574]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:02:00 compute-1 sudo[95572]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:01 compute-1 sudo[95724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdvjencoprebrrnwwzicrjzzzrmvwivh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547721.0009832-1134-193704705694955/AnsiballZ_copy.py'
Jan 27 21:02:01 compute-1 sudo[95724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:01 compute-1 python3.9[95726]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769547721.0009832-1134-193704705694955/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:02:01 compute-1 sudo[95724]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:02 compute-1 sudo[95800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttuuedcuazhcpgbmijfkqudowfcwcfks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547721.0009832-1134-193704705694955/AnsiballZ_systemd.py'
Jan 27 21:02:02 compute-1 sudo[95800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:02 compute-1 python3.9[95802]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:02:02 compute-1 systemd[1]: Reloading.
Jan 27 21:02:02 compute-1 systemd-sysv-generator[95831]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:02:02 compute-1 systemd-rc-local-generator[95827]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:02:02 compute-1 sudo[95800]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:02 compute-1 sudo[95911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeawwkcuifravtmrmmwdkitzisahgbqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547721.0009832-1134-193704705694955/AnsiballZ_systemd.py'
Jan 27 21:02:02 compute-1 sudo[95911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:03 compute-1 python3.9[95913]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:02:04 compute-1 systemd[1]: Reloading.
Jan 27 21:02:04 compute-1 systemd-rc-local-generator[95941]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:02:04 compute-1 systemd-sysv-generator[95945]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:02:04 compute-1 systemd[1]: Starting ovn_controller container...
Jan 27 21:02:04 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 27 21:02:04 compute-1 systemd[1]: Started libcrun container.
Jan 27 21:02:04 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45c36cd517bab70329dc82e21f5a69fb7d824d24de8777da0eb48dc543c71f78/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 27 21:02:04 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08.
Jan 27 21:02:04 compute-1 podman[95954]: 2026-01-27 21:02:04.685235279 +0000 UTC m=+0.170630785 container init 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 21:02:04 compute-1 ovn_controller[95969]: + sudo -E kolla_set_configs
Jan 27 21:02:04 compute-1 podman[95954]: 2026-01-27 21:02:04.713703069 +0000 UTC m=+0.199098595 container start 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 21:02:04 compute-1 edpm-start-podman-container[95954]: ovn_controller
Jan 27 21:02:04 compute-1 systemd[1]: Created slice User Slice of UID 0.
Jan 27 21:02:04 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 27 21:02:04 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 27 21:02:04 compute-1 systemd[1]: Starting User Manager for UID 0...
Jan 27 21:02:04 compute-1 edpm-start-podman-container[95953]: Creating additional drop-in dependency for "ovn_controller" (0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08)
Jan 27 21:02:04 compute-1 systemd[96008]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 27 21:02:04 compute-1 podman[95976]: 2026-01-27 21:02:04.817783843 +0000 UTC m=+0.090143434 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 27 21:02:04 compute-1 systemd[1]: 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08-49b0129d10de99a.service: Main process exited, code=exited, status=1/FAILURE
Jan 27 21:02:04 compute-1 systemd[1]: 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08-49b0129d10de99a.service: Failed with result 'exit-code'.
Jan 27 21:02:04 compute-1 systemd[1]: Reloading.
Jan 27 21:02:04 compute-1 systemd-rc-local-generator[96057]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:02:04 compute-1 systemd-sysv-generator[96060]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:02:04 compute-1 systemd[96008]: Queued start job for default target Main User Target.
Jan 27 21:02:04 compute-1 systemd[96008]: Created slice User Application Slice.
Jan 27 21:02:04 compute-1 systemd[96008]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 27 21:02:04 compute-1 systemd[96008]: Started Daily Cleanup of User's Temporary Directories.
Jan 27 21:02:04 compute-1 systemd[96008]: Reached target Paths.
Jan 27 21:02:04 compute-1 systemd[96008]: Reached target Timers.
Jan 27 21:02:04 compute-1 systemd[96008]: Starting D-Bus User Message Bus Socket...
Jan 27 21:02:04 compute-1 systemd[96008]: Starting Create User's Volatile Files and Directories...
Jan 27 21:02:04 compute-1 systemd[96008]: Finished Create User's Volatile Files and Directories.
Jan 27 21:02:04 compute-1 systemd[96008]: Listening on D-Bus User Message Bus Socket.
Jan 27 21:02:04 compute-1 systemd[96008]: Reached target Sockets.
Jan 27 21:02:04 compute-1 systemd[96008]: Reached target Basic System.
Jan 27 21:02:04 compute-1 systemd[96008]: Reached target Main User Target.
Jan 27 21:02:04 compute-1 systemd[96008]: Startup finished in 151ms.
Jan 27 21:02:05 compute-1 systemd[1]: Started User Manager for UID 0.
Jan 27 21:02:05 compute-1 systemd[1]: Started ovn_controller container.
Jan 27 21:02:05 compute-1 systemd[1]: Started Session c1 of User root.
Jan 27 21:02:05 compute-1 sudo[95911]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:05 compute-1 ovn_controller[95969]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 27 21:02:05 compute-1 ovn_controller[95969]: INFO:__main__:Validating config file
Jan 27 21:02:05 compute-1 ovn_controller[95969]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 27 21:02:05 compute-1 ovn_controller[95969]: INFO:__main__:Writing out command to execute
Jan 27 21:02:05 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 27 21:02:05 compute-1 ovn_controller[95969]: ++ cat /run_command
Jan 27 21:02:05 compute-1 ovn_controller[95969]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 27 21:02:05 compute-1 ovn_controller[95969]: + ARGS=
Jan 27 21:02:05 compute-1 ovn_controller[95969]: + sudo kolla_copy_cacerts
Jan 27 21:02:05 compute-1 systemd[1]: Started Session c2 of User root.
Jan 27 21:02:05 compute-1 ovn_controller[95969]: + [[ ! -n '' ]]
Jan 27 21:02:05 compute-1 ovn_controller[95969]: + . kolla_extend_start
Jan 27 21:02:05 compute-1 ovn_controller[95969]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 27 21:02:05 compute-1 ovn_controller[95969]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 27 21:02:05 compute-1 ovn_controller[95969]: + umask 0022
Jan 27 21:02:05 compute-1 ovn_controller[95969]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 27 21:02:05 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Jan 27 21:02:05 compute-1 ovn_controller[95969]: 2026-01-27T21:02:05Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Jan 27 21:02:05 compute-1 NetworkManager[56069]: <info>  [1769547725.1950] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 27 21:02:05 compute-1 NetworkManager[56069]: <info>  [1769547725.1955] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 27 21:02:05 compute-1 NetworkManager[56069]: <warn>  [1769547725.1957] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 27 21:02:05 compute-1 NetworkManager[56069]: <info>  [1769547725.1962] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 27 21:02:05 compute-1 NetworkManager[56069]: <info>  [1769547725.1966] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 27 21:02:05 compute-1 NetworkManager[56069]: <info>  [1769547725.1968] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 27 21:02:05 compute-1 kernel: br-int: entered promiscuous mode
Jan 27 21:02:05 compute-1 systemd-udevd[96106]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 21:02:06 compute-1 python3.9[96234]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00025|main|INFO|OVS feature set changed, force recompute.
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00034|features|INFO|OVS Feature: group_support, state: supported
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00035|main|INFO|OVS feature set changed, force recompute.
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 27 21:02:06 compute-1 ovn_controller[95969]: 2026-01-27T21:02:06Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 27 21:02:06 compute-1 NetworkManager[56069]: <info>  [1769547726.2199] manager: (ovn-05bac7-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 27 21:02:06 compute-1 NetworkManager[56069]: <info>  [1769547726.2213] manager: (ovn-e6126e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Jan 27 21:02:06 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Jan 27 21:02:06 compute-1 systemd-udevd[96108]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 21:02:06 compute-1 NetworkManager[56069]: <info>  [1769547726.2445] device (genev_sys_6081): carrier: link connected
Jan 27 21:02:06 compute-1 NetworkManager[56069]: <info>  [1769547726.2456] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Jan 27 21:02:07 compute-1 sudo[96387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grgjldhwkqwahukdqaqhvyuwpybldrzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547726.7514157-1224-85320562894558/AnsiballZ_stat.py'
Jan 27 21:02:07 compute-1 sudo[96387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:07 compute-1 python3.9[96389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:07 compute-1 sudo[96387]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:07 compute-1 sudo[96510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vthbsvxwwbclpxshhbwyhihsdwjkimci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547726.7514157-1224-85320562894558/AnsiballZ_copy.py'
Jan 27 21:02:07 compute-1 sudo[96510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:07 compute-1 python3.9[96512]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547726.7514157-1224-85320562894558/.source.yaml _original_basename=.skrwr9zm follow=False checksum=2f50479e4734db99c8dfe6971cadf3cb58c5fcfc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:02:07 compute-1 sudo[96510]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:08 compute-1 sudo[96662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dltolmovdqbnglvzksumdowrlzzdhtth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547728.1034772-1254-3311951307623/AnsiballZ_command.py'
Jan 27 21:02:08 compute-1 sudo[96662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:08 compute-1 python3.9[96664]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:02:08 compute-1 ovs-vsctl[96665]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 27 21:02:08 compute-1 sudo[96662]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:09 compute-1 sudo[96815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxeqyopdeegqvqvbzoszpontjfuyqoru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547729.0621657-1270-12776878418691/AnsiballZ_command.py'
Jan 27 21:02:09 compute-1 sudo[96815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:09 compute-1 python3.9[96817]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:02:09 compute-1 ovs-vsctl[96819]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 27 21:02:09 compute-1 sudo[96815]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:10 compute-1 sudo[96970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hphigmdhvltkxpnhkdfpwzjbmghfpkyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547730.1903763-1298-70522938154532/AnsiballZ_command.py'
Jan 27 21:02:10 compute-1 sudo[96970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:10 compute-1 python3.9[96972]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:02:10 compute-1 ovs-vsctl[96973]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 27 21:02:10 compute-1 sudo[96970]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:11 compute-1 sshd-session[85484]: Connection closed by 192.168.122.30 port 47052
Jan 27 21:02:11 compute-1 sshd-session[85481]: pam_unix(sshd:session): session closed for user zuul
Jan 27 21:02:11 compute-1 systemd[1]: session-21.scope: Deactivated successfully.
Jan 27 21:02:11 compute-1 systemd[1]: session-21.scope: Consumed 48.250s CPU time.
Jan 27 21:02:11 compute-1 systemd-logind[786]: Session 21 logged out. Waiting for processes to exit.
Jan 27 21:02:11 compute-1 systemd-logind[786]: Removed session 21.
Jan 27 21:02:15 compute-1 systemd[1]: Stopping User Manager for UID 0...
Jan 27 21:02:15 compute-1 systemd[96008]: Activating special unit Exit the Session...
Jan 27 21:02:15 compute-1 systemd[96008]: Stopped target Main User Target.
Jan 27 21:02:15 compute-1 systemd[96008]: Stopped target Basic System.
Jan 27 21:02:15 compute-1 systemd[96008]: Stopped target Paths.
Jan 27 21:02:15 compute-1 systemd[96008]: Stopped target Sockets.
Jan 27 21:02:15 compute-1 systemd[96008]: Stopped target Timers.
Jan 27 21:02:15 compute-1 systemd[96008]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 27 21:02:15 compute-1 systemd[96008]: Closed D-Bus User Message Bus Socket.
Jan 27 21:02:15 compute-1 systemd[96008]: Stopped Create User's Volatile Files and Directories.
Jan 27 21:02:15 compute-1 systemd[96008]: Removed slice User Application Slice.
Jan 27 21:02:15 compute-1 systemd[96008]: Reached target Shutdown.
Jan 27 21:02:15 compute-1 systemd[96008]: Finished Exit the Session.
Jan 27 21:02:15 compute-1 systemd[96008]: Reached target Exit the Session.
Jan 27 21:02:15 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Jan 27 21:02:15 compute-1 systemd[1]: Stopped User Manager for UID 0.
Jan 27 21:02:15 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 27 21:02:15 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 27 21:02:15 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 27 21:02:15 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 27 21:02:15 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Jan 27 21:02:18 compute-1 sshd-session[97000]: Accepted publickey for zuul from 192.168.122.30 port 58120 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 21:02:18 compute-1 systemd-logind[786]: New session 23 of user zuul.
Jan 27 21:02:18 compute-1 systemd[1]: Started Session 23 of User zuul.
Jan 27 21:02:18 compute-1 sshd-session[97000]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 21:02:19 compute-1 python3.9[97153]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:02:20 compute-1 sudo[97307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlxpkourmfrkvlqcchaqzasibjfutefr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547739.6379607-44-104490470833940/AnsiballZ_file.py'
Jan 27 21:02:20 compute-1 sudo[97307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:20 compute-1 python3.9[97309]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:20 compute-1 sudo[97307]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:20 compute-1 sudo[97459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awbriiedphsiknyfsyjulqoiyeqhtrer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547740.4698403-44-144603495344271/AnsiballZ_file.py'
Jan 27 21:02:20 compute-1 sudo[97459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:20 compute-1 python3.9[97461]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:20 compute-1 sudo[97459]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:21 compute-1 sudo[97611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eochuuuvjvrwmwuyshdjajdvakglgcbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547741.0630887-44-22958448191191/AnsiballZ_file.py'
Jan 27 21:02:21 compute-1 sudo[97611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:21 compute-1 python3.9[97613]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:21 compute-1 sudo[97611]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:22 compute-1 sudo[97763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teglzmiqswhaxyclhjoycjfrnmmllxyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547741.7541578-44-174802952982319/AnsiballZ_file.py'
Jan 27 21:02:22 compute-1 sudo[97763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:22 compute-1 python3.9[97765]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:22 compute-1 sudo[97763]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:22 compute-1 sudo[97915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckpgwunpgxyzggcvvoytgmqkutfhynps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547742.4450343-44-123035665921722/AnsiballZ_file.py'
Jan 27 21:02:22 compute-1 sudo[97915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:23 compute-1 python3.9[97917]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:23 compute-1 sudo[97915]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:24 compute-1 python3.9[98067]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:02:24 compute-1 sudo[98218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uscyuruoqtzpbhyihrwjlsdpshwakipr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547744.2793229-132-31521826932271/AnsiballZ_seboolean.py'
Jan 27 21:02:24 compute-1 sudo[98218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:24 compute-1 python3.9[98220]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 27 21:02:25 compute-1 sudo[98218]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:26 compute-1 python3.9[98370]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:27 compute-1 python3.9[98491]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547745.7680705-148-259322033803814/.source follow=False _original_basename=haproxy.j2 checksum=49138cf053eba954c45f906b24a92d7c634cefdd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:27 compute-1 python3.9[98641]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:28 compute-1 python3.9[98762]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547747.318517-178-101382061236759/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:28 compute-1 sudo[98912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhrynxfbuvwaghklgnwcveyuggykivaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547748.6618056-212-218909075780993/AnsiballZ_setup.py'
Jan 27 21:02:28 compute-1 sudo[98912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:29 compute-1 python3.9[98914]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 21:02:29 compute-1 sudo[98912]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:29 compute-1 sudo[98996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tehcedtuqysxbutqarfokugrykmqxtat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547748.6618056-212-218909075780993/AnsiballZ_dnf.py'
Jan 27 21:02:29 compute-1 sudo[98996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:30 compute-1 python3.9[98998]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 21:02:31 compute-1 sudo[98996]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:32 compute-1 sudo[99150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjgzckkpuntepjogbtqelkwioqyqfbhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547752.2049346-236-205494723214686/AnsiballZ_systemd.py'
Jan 27 21:02:32 compute-1 sudo[99150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:33 compute-1 python3.9[99152]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 27 21:02:33 compute-1 sudo[99150]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:33 compute-1 python3.9[99305]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:34 compute-1 python3.9[99426]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547753.4827347-252-57148533796894/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:35 compute-1 ovn_controller[95969]: 2026-01-27T21:02:35Z|00038|memory|INFO|15936 kB peak resident set size after 29.9 seconds
Jan 27 21:02:35 compute-1 ovn_controller[95969]: 2026-01-27T21:02:35Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 27 21:02:35 compute-1 podman[99550]: 2026-01-27 21:02:35.121041244 +0000 UTC m=+0.125978205 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 27 21:02:35 compute-1 python3.9[99587]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:35 compute-1 python3.9[99721]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547754.6993816-252-47593929077340/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:37 compute-1 python3.9[99871]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:37 compute-1 python3.9[99992]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547756.6941338-340-59400580287692/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:38 compute-1 python3.9[100142]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:39 compute-1 python3.9[100263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547758.0966268-340-270616809849679/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:40 compute-1 python3.9[100413]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:02:40 compute-1 sudo[100565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbtxccmnxkhmnfbfrmeaosxovzxpslym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547760.4250188-416-85328960529754/AnsiballZ_file.py'
Jan 27 21:02:40 compute-1 sudo[100565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:40 compute-1 python3.9[100567]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:40 compute-1 sudo[100565]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:41 compute-1 sudo[100717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyasvoctxuluhzltfjqwrvfbltzbzaya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547761.3633115-432-79609079837221/AnsiballZ_stat.py'
Jan 27 21:02:41 compute-1 sudo[100717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:41 compute-1 python3.9[100719]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:41 compute-1 sudo[100717]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:42 compute-1 sudo[100795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nejyfkqrnccstdrwkmdnooazdreapnva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547761.3633115-432-79609079837221/AnsiballZ_file.py'
Jan 27 21:02:42 compute-1 sudo[100795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:42 compute-1 python3.9[100797]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:42 compute-1 sudo[100795]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:42 compute-1 sudo[100947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tggtnuioxttjkhtearllplybvoisvkeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547762.5239253-432-115351199607725/AnsiballZ_stat.py'
Jan 27 21:02:42 compute-1 sudo[100947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:43 compute-1 python3.9[100949]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:43 compute-1 sudo[100947]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:43 compute-1 sudo[101025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsyuunpdyzsuacyokybdniqxnrpkiqis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547762.5239253-432-115351199607725/AnsiballZ_file.py'
Jan 27 21:02:43 compute-1 sudo[101025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:43 compute-1 python3.9[101027]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:43 compute-1 sudo[101025]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:44 compute-1 sudo[101177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcfkaldkgwenbybuxmlncrkjniacuyie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547763.8058014-478-227565600956255/AnsiballZ_file.py'
Jan 27 21:02:44 compute-1 sudo[101177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:44 compute-1 python3.9[101179]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:02:44 compute-1 sudo[101177]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:45 compute-1 sudo[101329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzrzjremlwcaafqjgljtumeqmxbjrmrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547764.6781883-494-211685425317603/AnsiballZ_stat.py'
Jan 27 21:02:45 compute-1 sudo[101329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:45 compute-1 python3.9[101331]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:45 compute-1 sudo[101329]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:45 compute-1 sudo[101407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwxkzzdvocmnybyklqowboijfosgjova ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547764.6781883-494-211685425317603/AnsiballZ_file.py'
Jan 27 21:02:45 compute-1 sudo[101407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:45 compute-1 python3.9[101409]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:02:45 compute-1 sudo[101407]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:46 compute-1 sudo[101559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-venkwggbekmdupoyomuabehfxodxtizp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547766.281221-518-18846698953180/AnsiballZ_stat.py'
Jan 27 21:02:46 compute-1 sudo[101559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:46 compute-1 python3.9[101561]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:46 compute-1 sudo[101559]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:47 compute-1 sudo[101637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nponulphzlljvrkociomfuzsbwanvrhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547766.281221-518-18846698953180/AnsiballZ_file.py'
Jan 27 21:02:47 compute-1 sudo[101637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:47 compute-1 python3.9[101639]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:02:47 compute-1 sudo[101637]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:47 compute-1 sudo[101789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjqdtsockldqvvnehvjzirfdxyezdps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547767.6410844-542-104958208433083/AnsiballZ_systemd.py'
Jan 27 21:02:47 compute-1 sudo[101789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:48 compute-1 python3.9[101791]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:02:48 compute-1 systemd[1]: Reloading.
Jan 27 21:02:48 compute-1 systemd-rc-local-generator[101821]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:02:48 compute-1 systemd-sysv-generator[101825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:02:48 compute-1 sudo[101789]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:49 compute-1 sudo[101978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suuypofehsficjmprdbglznegxmozpdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547768.9904265-558-255958864051226/AnsiballZ_stat.py'
Jan 27 21:02:49 compute-1 sudo[101978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:49 compute-1 python3.9[101980]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:49 compute-1 sudo[101978]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:49 compute-1 sudo[102056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rarnuxtahyadrwuujbentpnrtqxbtoqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547768.9904265-558-255958864051226/AnsiballZ_file.py'
Jan 27 21:02:49 compute-1 sudo[102056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:49 compute-1 python3.9[102058]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:02:49 compute-1 sudo[102056]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:50 compute-1 sudo[102208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trdvlyhstgxwxhknyfomqboqalnqikze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547770.2263937-582-152162444991600/AnsiballZ_stat.py'
Jan 27 21:02:50 compute-1 sudo[102208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:50 compute-1 python3.9[102210]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:50 compute-1 sudo[102208]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:50 compute-1 sudo[102286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niehbkhfntfdaounriwcygumysmlxtve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547770.2263937-582-152162444991600/AnsiballZ_file.py'
Jan 27 21:02:50 compute-1 sudo[102286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:51 compute-1 python3.9[102288]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:02:51 compute-1 sudo[102286]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:51 compute-1 sudo[102438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzcjtgpkfwhpxpqnxeugxmifmehkfxfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547771.572346-606-7971143373112/AnsiballZ_systemd.py'
Jan 27 21:02:51 compute-1 sudo[102438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:52 compute-1 python3.9[102440]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:02:52 compute-1 systemd[1]: Reloading.
Jan 27 21:02:52 compute-1 systemd-sysv-generator[102475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:02:52 compute-1 systemd-rc-local-generator[102471]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:02:52 compute-1 systemd[1]: Starting Create netns directory...
Jan 27 21:02:52 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 27 21:02:52 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 27 21:02:52 compute-1 systemd[1]: Finished Create netns directory.
Jan 27 21:02:52 compute-1 sudo[102438]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:53 compute-1 sudo[102633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfirbphuxbmgfgaamjnxdjeujqbhbpyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547773.4232228-626-33976521230413/AnsiballZ_file.py'
Jan 27 21:02:53 compute-1 sudo[102633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:53 compute-1 python3.9[102635]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:53 compute-1 sudo[102633]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:54 compute-1 sudo[102785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gofitbouljuanoffbbezrmjkmqfujqtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547774.1519227-642-97981882210713/AnsiballZ_stat.py'
Jan 27 21:02:54 compute-1 sudo[102785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:54 compute-1 python3.9[102787]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:54 compute-1 sudo[102785]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:55 compute-1 sudo[102908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvbyzgoorkummamyifxnubdwnmnfnyba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547774.1519227-642-97981882210713/AnsiballZ_copy.py'
Jan 27 21:02:55 compute-1 sudo[102908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:55 compute-1 python3.9[102910]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769547774.1519227-642-97981882210713/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:55 compute-1 sudo[102908]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:55 compute-1 sudo[103060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxirrwipagybtxlxtlzmxkhxmzjwyuom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547775.6659925-676-7857081508904/AnsiballZ_file.py'
Jan 27 21:02:55 compute-1 sudo[103060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:56 compute-1 python3.9[103062]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:02:56 compute-1 sudo[103060]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:56 compute-1 sudo[103212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxcsfvvvmpbcomgoeeipvwldqeucrfne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547776.5250678-692-150462593035277/AnsiballZ_file.py'
Jan 27 21:02:56 compute-1 sudo[103212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:57 compute-1 python3.9[103214]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:02:57 compute-1 sudo[103212]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:57 compute-1 sudo[103364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxlcvbohzakssbqugbqerjstmqilbbou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547777.2686853-708-20374023998548/AnsiballZ_stat.py'
Jan 27 21:02:57 compute-1 sudo[103364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:57 compute-1 python3.9[103366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:02:57 compute-1 sudo[103364]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:58 compute-1 sudo[103487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsukzhhryozeulhxicgndvawucjsosvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547777.2686853-708-20374023998548/AnsiballZ_copy.py'
Jan 27 21:02:58 compute-1 sudo[103487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:02:58 compute-1 python3.9[103489]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547777.2686853-708-20374023998548/.source.json _original_basename=.lv8b8rxe follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:02:58 compute-1 sudo[103487]: pam_unix(sudo:session): session closed for user root
Jan 27 21:02:59 compute-1 python3.9[103639]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:01 compute-1 sudo[104060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woezcfwjgaokcbinekafgdacxsejiqwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547780.8020759-788-249295097677900/AnsiballZ_container_config_data.py'
Jan 27 21:03:01 compute-1 sudo[104060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:01 compute-1 python3.9[104062]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 27 21:03:01 compute-1 sudo[104060]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:02 compute-1 sudo[104212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujpjireeefnqtaksbwgewotqlcsunidr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547781.90557-810-198646709853630/AnsiballZ_container_config_hash.py'
Jan 27 21:03:02 compute-1 sudo[104212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:02 compute-1 python3.9[104214]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 27 21:03:02 compute-1 sudo[104212]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:03 compute-1 sudo[104364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txtpgeijsuptjluagpwtsdcugspfdmqt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769547782.892026-830-106167338886247/AnsiballZ_edpm_container_manage.py'
Jan 27 21:03:03 compute-1 sudo[104364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:03 compute-1 python3[104366]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 27 21:03:03 compute-1 podman[104401]: 2026-01-27 21:03:03.865250435 +0000 UTC m=+0.055792532 container create 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Jan 27 21:03:03 compute-1 podman[104401]: 2026-01-27 21:03:03.836511452 +0000 UTC m=+0.027053559 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 21:03:03 compute-1 python3[104366]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 21:03:04 compute-1 sudo[104364]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:04 compute-1 sudo[104589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifyvguxlqhhmpbwjmdczmoefupozgunv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547784.5671813-846-192612506769791/AnsiballZ_stat.py'
Jan 27 21:03:04 compute-1 sudo[104589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:05 compute-1 python3.9[104591]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:03:05 compute-1 sudo[104589]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:05 compute-1 sudo[104758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwglpzvuzriejpaapgyjcrpsttacaogf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547785.4106038-864-128731716171149/AnsiballZ_file.py'
Jan 27 21:03:05 compute-1 sudo[104758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:05 compute-1 podman[104717]: 2026-01-27 21:03:05.73230754 +0000 UTC m=+0.091661418 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 27 21:03:05 compute-1 python3.9[104764]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:05 compute-1 sudo[104758]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:06 compute-1 sudo[104846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aplyuezvemumcwffkubnlnqnzgsxpwmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547785.4106038-864-128731716171149/AnsiballZ_stat.py'
Jan 27 21:03:06 compute-1 sudo[104846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:06 compute-1 python3.9[104848]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:03:06 compute-1 sudo[104846]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:06 compute-1 sudo[104997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxfmpdpxhkhoxreowoknyljxnxdexjvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547786.4798608-864-153071709587169/AnsiballZ_copy.py'
Jan 27 21:03:06 compute-1 sudo[104997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:07 compute-1 python3.9[104999]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769547786.4798608-864-153071709587169/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:07 compute-1 sudo[104997]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:07 compute-1 sudo[105073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqoxuevyhtjxzcfqtbyhlpicshrhcgrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547786.4798608-864-153071709587169/AnsiballZ_systemd.py'
Jan 27 21:03:07 compute-1 sudo[105073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:07 compute-1 python3.9[105075]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:03:07 compute-1 systemd[1]: Reloading.
Jan 27 21:03:07 compute-1 systemd-sysv-generator[105102]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:03:07 compute-1 systemd-rc-local-generator[105098]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:03:07 compute-1 sudo[105073]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:08 compute-1 sudo[105184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgqwwxbkdlnronqiugpfkrcjnszefblp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547786.4798608-864-153071709587169/AnsiballZ_systemd.py'
Jan 27 21:03:08 compute-1 sudo[105184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:08 compute-1 python3.9[105186]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:03:08 compute-1 systemd[1]: Reloading.
Jan 27 21:03:08 compute-1 systemd-rc-local-generator[105215]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:03:08 compute-1 systemd-sysv-generator[105219]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:03:08 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Jan 27 21:03:09 compute-1 systemd[1]: Started libcrun container.
Jan 27 21:03:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13f618f3cc8da3b46ce6cde002eb75e08b6ab33e876ee1de1d72ae090f3e8d3d/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 27 21:03:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13f618f3cc8da3b46ce6cde002eb75e08b6ab33e876ee1de1d72ae090f3e8d3d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 21:03:09 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d.
Jan 27 21:03:09 compute-1 podman[105226]: 2026-01-27 21:03:09.151446816 +0000 UTC m=+0.160629771 container init 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: + sudo -E kolla_set_configs
Jan 27 21:03:09 compute-1 podman[105226]: 2026-01-27 21:03:09.178921123 +0000 UTC m=+0.188104068 container start 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 27 21:03:09 compute-1 edpm-start-podman-container[105226]: ovn_metadata_agent
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Validating config file
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Copying service configuration files
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Writing out command to execute
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 27 21:03:09 compute-1 edpm-start-podman-container[105225]: Creating additional drop-in dependency for "ovn_metadata_agent" (6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d)
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: ++ cat /run_command
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: + CMD=neutron-ovn-metadata-agent
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: + ARGS=
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: + sudo kolla_copy_cacerts
Jan 27 21:03:09 compute-1 podman[105249]: 2026-01-27 21:03:09.259563704 +0000 UTC m=+0.060140275 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS)
Jan 27 21:03:09 compute-1 systemd[1]: Reloading.
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: + [[ ! -n '' ]]
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: + . kolla_extend_start
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: Running command: 'neutron-ovn-metadata-agent'
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: + umask 0022
Jan 27 21:03:09 compute-1 ovn_metadata_agent[105242]: + exec neutron-ovn-metadata-agent
Jan 27 21:03:09 compute-1 systemd-sysv-generator[105317]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:03:09 compute-1 systemd-rc-local-generator[105314]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:03:09 compute-1 systemd[1]: Started ovn_metadata_agent container.
Jan 27 21:03:09 compute-1 sudo[105184]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:10 compute-1 python3.9[105477]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.087 105247 INFO neutron.common.config [-] Logging enabled!
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.089 105247 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.089 105247 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.089 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.090 105247 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.090 105247 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.090 105247 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.090 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.090 105247 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.090 105247 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.090 105247 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.091 105247 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.091 105247 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.091 105247 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.091 105247 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.091 105247 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.091 105247 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.091 105247 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.092 105247 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.092 105247 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.092 105247 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.092 105247 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.092 105247 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.092 105247 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.092 105247 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.093 105247 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.093 105247 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.093 105247 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.093 105247 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.093 105247 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.093 105247 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.093 105247 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.094 105247 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.094 105247 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.094 105247 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.094 105247 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.094 105247 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.094 105247 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.094 105247 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.095 105247 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.095 105247 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.095 105247 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.095 105247 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.095 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.095 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.095 105247 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.096 105247 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.096 105247 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.096 105247 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.096 105247 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.096 105247 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.096 105247 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.096 105247 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.096 105247 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.097 105247 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.097 105247 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.097 105247 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.097 105247 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.097 105247 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.097 105247 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.097 105247 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.097 105247 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.098 105247 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.098 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.098 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.098 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.098 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.098 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.098 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.098 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.099 105247 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.110 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.099 105247 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.099 105247 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.099 105247 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.099 105247 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.099 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.099 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.100 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.100 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.100 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.100 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.100 105247 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.100 105247 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.100 105247 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.100 105247 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.101 105247 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.101 105247 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.101 105247 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.101 105247 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.101 105247 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.101 105247 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.101 105247 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.101 105247 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.101 105247 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.102 105247 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.102 105247 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.102 105247 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.102 105247 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.102 105247 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.102 105247 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.102 105247 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.102 105247 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.102 105247 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.102 105247 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.103 105247 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.103 105247 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.103 105247 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.103 105247 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.103 105247 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.103 105247 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.103 105247 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.103 105247 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.103 105247 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.104 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.104 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.104 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.104 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.104 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.104 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.104 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.104 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.104 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.105 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.105 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.105 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.105 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.105 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.105 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.105 105247 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.105 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.105 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.106 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.106 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.106 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.106 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.106 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.106 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.106 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.106 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.107 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.107 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.107 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.107 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.107 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.107 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.107 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.107 105247 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.107 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.108 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.108 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.108 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.108 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.108 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.108 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.108 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.108 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.108 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.108 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.109 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.109 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.109 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.109 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.109 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.109 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.109 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.109 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.109 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.109 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.110 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.110 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.110 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.110 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.110 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.110 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.110 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.110 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.110 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.111 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.111 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.111 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.111 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.111 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.111 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.111 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.111 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.111 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.112 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.112 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.112 105247 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.112 105247 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.112 105247 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.112 105247 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.112 105247 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.112 105247 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.112 105247 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.112 105247 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.113 105247 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.113 105247 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.113 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.113 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.113 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.113 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.113 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.113 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.113 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.114 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.114 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.114 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.114 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.114 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.114 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.114 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.114 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.114 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.115 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.115 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.115 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.115 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.115 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.115 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.115 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.115 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.115 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.116 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.116 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.116 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.116 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.116 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.116 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.116 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.116 105247 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.116 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.117 105247 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.117 105247 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.117 105247 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.117 105247 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.117 105247 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.117 105247 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.117 105247 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.118 105247 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.118 105247 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.118 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.118 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.118 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.118 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.118 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.118 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.118 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.118 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.119 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.119 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.119 105247 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.119 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.119 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.119 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.119 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.119 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.119 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.120 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.120 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.120 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.120 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.120 105247 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.120 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.120 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.120 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.121 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.121 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.121 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.121 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.121 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.121 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.121 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.121 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.121 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.122 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.122 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.122 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.122 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.122 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.122 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.122 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.122 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.122 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.123 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.123 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.123 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.123 105247 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.123 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.123 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.123 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.123 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.123 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.124 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.124 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.124 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.124 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.124 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.124 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.124 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.124 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.124 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.125 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.125 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.125 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.125 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.125 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.125 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.125 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.125 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.125 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.126 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.126 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.126 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.126 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.126 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.126 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.126 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.126 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.126 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.126 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.127 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.127 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.127 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.127 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.127 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.127 105247 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.127 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.127 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.127 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.128 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.128 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.128 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.128 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.128 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.128 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.128 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.128 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.128 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.129 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.129 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.129 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.129 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.129 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.129 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.129 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.129 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.129 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.130 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.130 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.130 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.130 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.130 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.130 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.130 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.130 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.130 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.130 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.131 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.131 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.131 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.131 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.131 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.131 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.131 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.131 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.131 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.132 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.132 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.132 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.132 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.132 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.132 105247 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.132 105247 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.182 105247 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.182 105247 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.182 105247 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.183 105247 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.183 105247 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.192 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name af804609-b297-47b2-80af-51c874daa876 (UUID: af804609-b297-47b2-80af-51c874daa876) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.217 105247 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.217 105247 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.217 105247 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.217 105247 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.217 105247 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.220 105247 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.225 105247 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.231 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'af804609-b297-47b2-80af-51c874daa876'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], external_ids={}, name=af804609-b297-47b2-80af-51c874daa876, nb_cfg_timestamp=1769547734214, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.233 105247 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpkcv0j9r1/privsep.sock']
Jan 27 21:03:11 compute-1 sudo[105638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijpzogjndnymngmeikeuscuopjqfhvqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547791.1297007-954-202635091660424/AnsiballZ_stat.py'
Jan 27 21:03:11 compute-1 sudo[105638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:11 compute-1 python3.9[105640]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:03:11 compute-1 sudo[105638]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:11 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.947 105247 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.948 105247 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpkcv0j9r1/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.824 105687 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.827 105687 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.829 105687 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.829 105687 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105687
Jan 27 21:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:11.949 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[ea742fb9-bfd2-4814-a062-956a99ab0993]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 21:03:12 compute-1 sudo[105769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vffjwthjgiyjvfehljlqufnnznsbasvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547791.1297007-954-202635091660424/AnsiballZ_copy.py'
Jan 27 21:03:12 compute-1 sudo[105769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:12 compute-1 python3.9[105771]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769547791.1297007-954-202635091660424/.source.yaml _original_basename=.lpr218e7 follow=False checksum=65c86d30cc4d25e4aa89945c623df99f93e6d212 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:12 compute-1 sudo[105769]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:12 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:12.382 105687 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:03:12 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:12.382 105687 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:03:12 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:12.382 105687 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:03:12 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:12.830 105687 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 27 21:03:12 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:12.836 105687 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 27 21:03:12 compute-1 sshd-session[97003]: Connection closed by 192.168.122.30 port 58120
Jan 27 21:03:12 compute-1 sshd-session[97000]: pam_unix(sshd:session): session closed for user zuul
Jan 27 21:03:12 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:12.876 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[29a1217f-a27f-47d7-b3a4-e321833b7115]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 21:03:12 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Jan 27 21:03:12 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:12.878 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, column=external_ids, values=({'neutron:ovn-metadata-id': 'e12fc867-82b8-544f-9d84-fec344001ace'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 21:03:12 compute-1 systemd[1]: session-23.scope: Consumed 38.158s CPU time.
Jan 27 21:03:12 compute-1 systemd-logind[786]: Session 23 logged out. Waiting for processes to exit.
Jan 27 21:03:12 compute-1 systemd-logind[786]: Removed session 23.
Jan 27 21:03:12 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:12.884 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 21:03:12 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:03:12.889 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 21:03:19 compute-1 sshd-session[105796]: Accepted publickey for zuul from 192.168.122.30 port 42264 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 21:03:19 compute-1 systemd-logind[786]: New session 24 of user zuul.
Jan 27 21:03:19 compute-1 systemd[1]: Started Session 24 of User zuul.
Jan 27 21:03:19 compute-1 sshd-session[105796]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 21:03:20 compute-1 python3.9[105949]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:03:20 compute-1 sudo[106103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gszodazcqhdsebjeokygvxxwfeesdquc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547800.5465899-44-38847939797109/AnsiballZ_command.py'
Jan 27 21:03:20 compute-1 sudo[106103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:21 compute-1 python3.9[106105]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:03:21 compute-1 sudo[106103]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:22 compute-1 sudo[106268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfqcfgurmdihyuuzzuolfzmchuhyqzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547801.5760605-66-62918910034454/AnsiballZ_systemd_service.py'
Jan 27 21:03:22 compute-1 sudo[106268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:22 compute-1 python3.9[106270]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:03:22 compute-1 systemd[1]: Reloading.
Jan 27 21:03:22 compute-1 systemd-sysv-generator[106302]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:03:22 compute-1 systemd-rc-local-generator[106299]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:03:22 compute-1 sudo[106268]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:23 compute-1 python3.9[106456]: ansible-ansible.builtin.service_facts Invoked
Jan 27 21:03:23 compute-1 network[106473]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 27 21:03:23 compute-1 network[106474]: 'network-scripts' will be removed from distribution in near future.
Jan 27 21:03:23 compute-1 network[106475]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 27 21:03:27 compute-1 sudo[106734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynvowsiawgqkbrnfkthcvxtnqgmczona ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547806.7063026-104-2452650034335/AnsiballZ_systemd_service.py'
Jan 27 21:03:27 compute-1 sudo[106734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:27 compute-1 python3.9[106736]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:03:27 compute-1 sudo[106734]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:27 compute-1 sudo[106887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvdxlcpkpuilyokeqkdnwezywahebibs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547807.5069318-104-196151493508587/AnsiballZ_systemd_service.py'
Jan 27 21:03:27 compute-1 sudo[106887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:28 compute-1 python3.9[106889]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:03:28 compute-1 sudo[106887]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:28 compute-1 sudo[107040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqrqlpbphchkulagrkunrrmzjlhdxxkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547808.4273632-104-273415751675089/AnsiballZ_systemd_service.py'
Jan 27 21:03:28 compute-1 sudo[107040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:29 compute-1 python3.9[107042]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:03:29 compute-1 sudo[107040]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:29 compute-1 sudo[107193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gilptltpjhfjjlwnxpgawcomhfuvmlvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547809.2919195-104-181277870135130/AnsiballZ_systemd_service.py'
Jan 27 21:03:29 compute-1 sudo[107193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:29 compute-1 python3.9[107195]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:03:29 compute-1 sudo[107193]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:30 compute-1 sudo[107346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sceapwayalmmhkwfzeakesfrufzrtldr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547810.116797-104-258633300009922/AnsiballZ_systemd_service.py'
Jan 27 21:03:30 compute-1 sudo[107346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:30 compute-1 python3.9[107348]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:03:30 compute-1 sudo[107346]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:31 compute-1 sudo[107499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhkabomzbdvpmdncnedfrxtkelxdwfvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547810.9130177-104-132797707795701/AnsiballZ_systemd_service.py'
Jan 27 21:03:31 compute-1 sudo[107499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:31 compute-1 python3.9[107501]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:03:31 compute-1 sudo[107499]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:32 compute-1 sudo[107652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elzefdycnryotpcevyklzvtpuyjwwetq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547811.754169-104-194888306829943/AnsiballZ_systemd_service.py'
Jan 27 21:03:32 compute-1 sudo[107652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:32 compute-1 python3.9[107654]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:03:32 compute-1 sudo[107652]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:33 compute-1 sudo[107805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnbmgabxtouwjqsohwfkclkucvoalojg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547812.7634602-208-18944242310335/AnsiballZ_file.py'
Jan 27 21:03:33 compute-1 sudo[107805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:33 compute-1 python3.9[107807]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:33 compute-1 sudo[107805]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:33 compute-1 sudo[107957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkuxvroneoiscmjkocraqytctbgsmlpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547813.4834645-208-128245217603420/AnsiballZ_file.py'
Jan 27 21:03:33 compute-1 sudo[107957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:33 compute-1 python3.9[107959]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:34 compute-1 sudo[107957]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:34 compute-1 sudo[108109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgxezmriojisnwxnivvhwgapxkowmgjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547814.1687398-208-260654993774793/AnsiballZ_file.py'
Jan 27 21:03:34 compute-1 sudo[108109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:34 compute-1 python3.9[108111]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:34 compute-1 sudo[108109]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:35 compute-1 sudo[108261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsfudvueafuzuxvsyxopidpfjyhmency ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547814.822811-208-191381408277949/AnsiballZ_file.py'
Jan 27 21:03:35 compute-1 sudo[108261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:35 compute-1 python3.9[108263]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:35 compute-1 sudo[108261]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:35 compute-1 sudo[108413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwkwgweheiykgqiltkmularfhbtbvszt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547815.4785852-208-249108450099115/AnsiballZ_file.py'
Jan 27 21:03:35 compute-1 sudo[108413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:35 compute-1 podman[108415]: 2026-01-27 21:03:35.974352869 +0000 UTC m=+0.152388715 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:03:36 compute-1 python3.9[108416]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:36 compute-1 sudo[108413]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:36 compute-1 sudo[108592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsuqtfnbqbxzkxohretbkuygoxbevwas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547816.2004673-208-85802917380344/AnsiballZ_file.py'
Jan 27 21:03:36 compute-1 sudo[108592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:36 compute-1 python3.9[108594]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:36 compute-1 sudo[108592]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:37 compute-1 sudo[108744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agcivflymowlfcuonohnyyikknhrgnua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547816.9760177-208-196125301400773/AnsiballZ_file.py'
Jan 27 21:03:37 compute-1 sudo[108744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:37 compute-1 python3.9[108746]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:37 compute-1 sudo[108744]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:37 compute-1 sudo[108896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfnsvcgmgahphfkmggzniuvosifdfqum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547817.658748-308-22936204436165/AnsiballZ_file.py'
Jan 27 21:03:37 compute-1 sudo[108896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:38 compute-1 python3.9[108898]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:38 compute-1 sudo[108896]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:38 compute-1 sudo[109048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mogzudfunzbvzzejzjrioqaibmigzakt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547818.3508282-308-27051112954336/AnsiballZ_file.py'
Jan 27 21:03:38 compute-1 sudo[109048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:38 compute-1 python3.9[109050]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:38 compute-1 sudo[109048]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:39 compute-1 sudo[109200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ferewneibolqtaurmtlgvfqkqxnzlvom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547819.0326982-308-226291454913112/AnsiballZ_file.py'
Jan 27 21:03:39 compute-1 sudo[109200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:39 compute-1 python3.9[109202]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:39 compute-1 sudo[109200]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:39 compute-1 podman[109250]: 2026-01-27 21:03:39.741739299 +0000 UTC m=+0.050342685 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Jan 27 21:03:39 compute-1 sudo[109371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rckfsyzgwuteekdckzxpssjhumqqphkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547819.669833-308-6990713288258/AnsiballZ_file.py'
Jan 27 21:03:39 compute-1 sudo[109371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:40 compute-1 python3.9[109373]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:40 compute-1 sudo[109371]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:40 compute-1 sudo[109523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfqidpxvcombofawyadzvlvobwmllnze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547820.3187215-308-221158498514658/AnsiballZ_file.py'
Jan 27 21:03:40 compute-1 sudo[109523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:40 compute-1 python3.9[109525]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:40 compute-1 sudo[109523]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:41 compute-1 sudo[109675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxabkuodbiblmrpencwpzfofxnwmuify ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547820.9008486-308-120486819418359/AnsiballZ_file.py'
Jan 27 21:03:41 compute-1 sudo[109675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:41 compute-1 python3.9[109677]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:41 compute-1 sudo[109675]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:41 compute-1 sudo[109827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovdvhaltyufygdjqwbkmjqjhfdftraan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547821.5527756-308-225042083711331/AnsiballZ_file.py'
Jan 27 21:03:41 compute-1 sudo[109827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:42 compute-1 python3.9[109829]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:03:42 compute-1 sudo[109827]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:42 compute-1 sudo[109979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfmmlhpchsdkruzuxrspsdepfvnorept ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547822.5159209-410-132560818535828/AnsiballZ_command.py'
Jan 27 21:03:42 compute-1 sudo[109979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:43 compute-1 python3.9[109981]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:03:43 compute-1 sudo[109979]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:43 compute-1 python3.9[110133]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 27 21:03:44 compute-1 sudo[110283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oauqquhvdrmeyjlunjzpugknjraqgoyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547824.1237264-446-34873853989473/AnsiballZ_systemd_service.py'
Jan 27 21:03:44 compute-1 sudo[110283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:44 compute-1 python3.9[110285]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:03:44 compute-1 systemd[1]: Reloading.
Jan 27 21:03:44 compute-1 systemd-rc-local-generator[110304]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:03:44 compute-1 systemd-sysv-generator[110309]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:03:45 compute-1 sudo[110283]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:45 compute-1 sudo[110471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkfrezehxldqlfrbjtlumgbgzcbnalfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547825.2151916-462-54487300392171/AnsiballZ_command.py'
Jan 27 21:03:45 compute-1 sudo[110471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:45 compute-1 python3.9[110473]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:03:45 compute-1 sudo[110471]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:46 compute-1 sudo[110624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqswyevdpgkqrgxkkspusgdnqjmnwabw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547825.9868042-462-44750583273081/AnsiballZ_command.py'
Jan 27 21:03:46 compute-1 sudo[110624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:46 compute-1 python3.9[110626]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:03:46 compute-1 sudo[110624]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:47 compute-1 sudo[110777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpmubchkesdbfrjeykdzxjutfllwtnlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547826.733966-462-273468876447701/AnsiballZ_command.py'
Jan 27 21:03:47 compute-1 sudo[110777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:47 compute-1 python3.9[110779]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:03:47 compute-1 sudo[110777]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:47 compute-1 sudo[110930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjsfkinswypwddlormydtdkjhiwojlrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547827.4220378-462-25778882400151/AnsiballZ_command.py'
Jan 27 21:03:47 compute-1 sudo[110930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:47 compute-1 python3.9[110932]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:03:47 compute-1 sudo[110930]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:48 compute-1 sudo[111083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayemzdaynddnlypmushwotnxtfdjwjdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547828.0805902-462-78139734176879/AnsiballZ_command.py'
Jan 27 21:03:48 compute-1 sudo[111083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:48 compute-1 python3.9[111085]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:03:48 compute-1 sudo[111083]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:49 compute-1 sudo[111236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovapegsstngiedtarhfimfmmtjdtxsfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547828.7962945-462-121525653159323/AnsiballZ_command.py'
Jan 27 21:03:49 compute-1 sudo[111236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:49 compute-1 python3.9[111238]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:03:49 compute-1 sudo[111236]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:49 compute-1 sudo[111389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqdnfiwhhurkltuhxmiddatuasrkbpua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547829.489846-462-191331140227644/AnsiballZ_command.py'
Jan 27 21:03:49 compute-1 sudo[111389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:50 compute-1 python3.9[111391]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:03:50 compute-1 sudo[111389]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:51 compute-1 sudo[111542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytbeecsalurmycnzhkhxmoisganryupr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547830.8387551-570-53014644930232/AnsiballZ_getent.py'
Jan 27 21:03:51 compute-1 sudo[111542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:51 compute-1 python3.9[111544]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 27 21:03:51 compute-1 sudo[111542]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:52 compute-1 sudo[111695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pifostftosdkubswszwlhlfqeimdtrah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547831.8792043-586-137785015519962/AnsiballZ_group.py'
Jan 27 21:03:52 compute-1 sudo[111695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:52 compute-1 python3.9[111697]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 27 21:03:52 compute-1 groupadd[111698]: group added to /etc/group: name=libvirt, GID=42473
Jan 27 21:03:52 compute-1 groupadd[111698]: group added to /etc/gshadow: name=libvirt
Jan 27 21:03:52 compute-1 groupadd[111698]: new group: name=libvirt, GID=42473
Jan 27 21:03:52 compute-1 sudo[111695]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:53 compute-1 sudo[111853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbwdvyecjuhvidnrmfyndiyduacdivxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547832.8666263-602-62798675831398/AnsiballZ_user.py'
Jan 27 21:03:53 compute-1 sudo[111853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:53 compute-1 python3.9[111855]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 27 21:03:53 compute-1 useradd[111857]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 27 21:03:53 compute-1 sudo[111853]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:54 compute-1 sudo[112013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnrtgqmjmaeonrzgtkwzydoywwdobzko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547834.157519-624-41284312559473/AnsiballZ_setup.py'
Jan 27 21:03:54 compute-1 sudo[112013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:54 compute-1 python3.9[112015]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 21:03:55 compute-1 sudo[112013]: pam_unix(sudo:session): session closed for user root
Jan 27 21:03:55 compute-1 sudo[112097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zigzorwoaurkgixpjtfcsdtplnsokeus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547834.157519-624-41284312559473/AnsiballZ_dnf.py'
Jan 27 21:03:55 compute-1 sudo[112097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:03:55 compute-1 python3.9[112099]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 21:04:06 compute-1 podman[112132]: 2026-01-27 21:04:06.817948524 +0000 UTC m=+0.121373556 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:04:10 compute-1 podman[112172]: 2026-01-27 21:04:10.753614562 +0000 UTC m=+0.063215516 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260126, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Jan 27 21:04:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:04:11.134 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:04:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:04:11.134 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:04:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:04:11.134 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:04:22 compute-1 sshd-session[112367]: Invalid user solana from 80.94.92.186 port 34088
Jan 27 21:04:22 compute-1 sshd-session[112367]: Connection closed by invalid user solana 80.94.92.186 port 34088 [preauth]
Jan 27 21:04:33 compute-1 kernel: SELinux:  Converting 2764 SID table entries...
Jan 27 21:04:33 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 27 21:04:33 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 27 21:04:33 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 27 21:04:33 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 27 21:04:33 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 27 21:04:33 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 27 21:04:33 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 27 21:04:37 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 27 21:04:37 compute-1 podman[112383]: 2026-01-27 21:04:37.847381151 +0000 UTC m=+0.141658701 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Jan 27 21:04:41 compute-1 podman[112410]: 2026-01-27 21:04:41.762955429 +0000 UTC m=+0.072140576 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 21:04:43 compute-1 kernel: SELinux:  Converting 2764 SID table entries...
Jan 27 21:04:43 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 27 21:04:43 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 27 21:04:43 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 27 21:04:43 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 27 21:04:43 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 27 21:04:43 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 27 21:04:43 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 27 21:05:08 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 27 21:05:08 compute-1 podman[120272]: 2026-01-27 21:05:08.808572848 +0000 UTC m=+0.110869641 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:05:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:05:11.135 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:05:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:05:11.135 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:05:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:05:11.136 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:05:12 compute-1 podman[122506]: 2026-01-27 21:05:12.765706912 +0000 UTC m=+0.074595648 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 27 21:05:36 compute-1 kernel: SELinux:  Converting 2765 SID table entries...
Jan 27 21:05:36 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 27 21:05:36 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 27 21:05:36 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 27 21:05:36 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 27 21:05:36 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 27 21:05:36 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 27 21:05:36 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 27 21:05:37 compute-1 groupadd[129364]: group added to /etc/group: name=dnsmasq, GID=993
Jan 27 21:05:37 compute-1 groupadd[129364]: group added to /etc/gshadow: name=dnsmasq
Jan 27 21:05:37 compute-1 groupadd[129364]: new group: name=dnsmasq, GID=993
Jan 27 21:05:37 compute-1 useradd[129371]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 27 21:05:37 compute-1 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Jan 27 21:05:37 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 27 21:05:37 compute-1 dbus-broker-launch[758]: Noticed file-system modification, trigger reload.
Jan 27 21:05:38 compute-1 groupadd[129384]: group added to /etc/group: name=clevis, GID=992
Jan 27 21:05:38 compute-1 groupadd[129384]: group added to /etc/gshadow: name=clevis
Jan 27 21:05:38 compute-1 groupadd[129384]: new group: name=clevis, GID=992
Jan 27 21:05:38 compute-1 useradd[129392]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 27 21:05:38 compute-1 usermod[129420]: add 'clevis' to group 'tss'
Jan 27 21:05:38 compute-1 usermod[129420]: add 'clevis' to shadow group 'tss'
Jan 27 21:05:38 compute-1 podman[129387]: 2026-01-27 21:05:38.963026858 +0000 UTC m=+0.109218571 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:05:41 compute-1 polkitd[44193]: Reloading rules
Jan 27 21:05:41 compute-1 polkitd[44193]: Collecting garbage unconditionally...
Jan 27 21:05:41 compute-1 polkitd[44193]: Loading rules from directory /etc/polkit-1/rules.d
Jan 27 21:05:41 compute-1 polkitd[44193]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 27 21:05:41 compute-1 polkitd[44193]: Finished loading, compiling and executing 3 rules
Jan 27 21:05:41 compute-1 polkitd[44193]: Reloading rules
Jan 27 21:05:41 compute-1 polkitd[44193]: Collecting garbage unconditionally...
Jan 27 21:05:41 compute-1 polkitd[44193]: Loading rules from directory /etc/polkit-1/rules.d
Jan 27 21:05:41 compute-1 polkitd[44193]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 27 21:05:41 compute-1 polkitd[44193]: Finished loading, compiling and executing 3 rules
Jan 27 21:05:42 compute-1 groupadd[129617]: group added to /etc/group: name=ceph, GID=167
Jan 27 21:05:42 compute-1 groupadd[129617]: group added to /etc/gshadow: name=ceph
Jan 27 21:05:42 compute-1 groupadd[129617]: new group: name=ceph, GID=167
Jan 27 21:05:42 compute-1 useradd[129623]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 27 21:05:43 compute-1 podman[129630]: 2026-01-27 21:05:43.785428379 +0000 UTC m=+0.094230841 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:05:45 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Jan 27 21:05:45 compute-1 sshd[1007]: Received signal 15; terminating.
Jan 27 21:05:45 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Jan 27 21:05:45 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Jan 27 21:05:45 compute-1 systemd[1]: sshd.service: Consumed 5.610s CPU time, read 32.0K from disk, written 40.0K to disk.
Jan 27 21:05:45 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Jan 27 21:05:45 compute-1 systemd[1]: Stopping sshd-keygen.target...
Jan 27 21:05:45 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 27 21:05:45 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 27 21:05:45 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 27 21:05:45 compute-1 systemd[1]: Reached target sshd-keygen.target.
Jan 27 21:05:45 compute-1 systemd[1]: Starting OpenSSH server daemon...
Jan 27 21:05:45 compute-1 sshd[130160]: Server listening on 0.0.0.0 port 22.
Jan 27 21:05:45 compute-1 sshd[130160]: Server listening on :: port 22.
Jan 27 21:05:45 compute-1 systemd[1]: Started OpenSSH server daemon.
Jan 27 21:05:47 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 27 21:05:48 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 27 21:05:48 compute-1 systemd[1]: Reloading.
Jan 27 21:05:48 compute-1 systemd-rc-local-generator[130419]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:05:48 compute-1 systemd-sysv-generator[130422]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:05:48 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 27 21:05:50 compute-1 sudo[112097]: pam_unix(sudo:session): session closed for user root
Jan 27 21:05:56 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 27 21:05:56 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 27 21:05:56 compute-1 systemd[1]: man-db-cache-update.service: Consumed 10.507s CPU time.
Jan 27 21:05:56 compute-1 systemd[1]: run-r78a6b758d3474f37af1460d586531311.service: Deactivated successfully.
Jan 27 21:05:56 compute-1 sudo[138944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usdmkqtthrffhzirztigijbumhysesns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547955.824394-648-193763926402197/AnsiballZ_systemd.py'
Jan 27 21:05:56 compute-1 sudo[138944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:05:56 compute-1 python3.9[138946]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 27 21:05:56 compute-1 systemd[1]: Reloading.
Jan 27 21:05:56 compute-1 systemd-rc-local-generator[138976]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:05:56 compute-1 systemd-sysv-generator[138979]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:05:57 compute-1 sudo[138944]: pam_unix(sudo:session): session closed for user root
Jan 27 21:05:57 compute-1 sudo[139134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlaetafzfvnfedgfkidtkgxrmradvkpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547957.2302814-648-201194639729582/AnsiballZ_systemd.py'
Jan 27 21:05:57 compute-1 sudo[139134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:05:57 compute-1 python3.9[139136]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 27 21:05:57 compute-1 systemd[1]: Reloading.
Jan 27 21:05:57 compute-1 systemd-rc-local-generator[139165]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:05:57 compute-1 systemd-sysv-generator[139169]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:05:58 compute-1 sudo[139134]: pam_unix(sudo:session): session closed for user root
Jan 27 21:05:58 compute-1 sudo[139324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvtfoelirhsiuzwfayruarpelvffyvft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547958.3583584-648-237639207822440/AnsiballZ_systemd.py'
Jan 27 21:05:58 compute-1 sudo[139324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:05:58 compute-1 python3.9[139326]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 27 21:05:59 compute-1 systemd[1]: Reloading.
Jan 27 21:05:59 compute-1 systemd-rc-local-generator[139357]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:05:59 compute-1 systemd-sysv-generator[139360]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:05:59 compute-1 sudo[139324]: pam_unix(sudo:session): session closed for user root
Jan 27 21:05:59 compute-1 sudo[139515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwodgvnjjpikgmirlrlubwonlogdytdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547959.5566118-648-32399727415376/AnsiballZ_systemd.py'
Jan 27 21:05:59 compute-1 sudo[139515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:00 compute-1 python3.9[139517]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 27 21:06:00 compute-1 systemd[1]: Reloading.
Jan 27 21:06:00 compute-1 systemd-rc-local-generator[139548]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:06:00 compute-1 systemd-sysv-generator[139552]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:06:00 compute-1 sudo[139515]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:01 compute-1 sudo[139705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfrtffehuictlntubjblrhxdalxpmcfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547961.116888-706-136602756779878/AnsiballZ_systemd.py'
Jan 27 21:06:01 compute-1 sudo[139705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:01 compute-1 python3.9[139707]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:02 compute-1 systemd[1]: Reloading.
Jan 27 21:06:02 compute-1 systemd-rc-local-generator[139738]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:06:02 compute-1 systemd-sysv-generator[139741]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:06:03 compute-1 sudo[139705]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:03 compute-1 sudo[139895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwgythotqryqyxlzimasrnwjgzcgbnui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547963.2696655-706-205000664061588/AnsiballZ_systemd.py'
Jan 27 21:06:03 compute-1 sudo[139895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:03 compute-1 python3.9[139897]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:03 compute-1 systemd[1]: Reloading.
Jan 27 21:06:04 compute-1 systemd-rc-local-generator[139929]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:06:04 compute-1 systemd-sysv-generator[139933]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:06:04 compute-1 sudo[139895]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:04 compute-1 sudo[140085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkztdsrlutgbiwqfoqiwugbrmavsayzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547964.4204493-706-72806747263528/AnsiballZ_systemd.py'
Jan 27 21:06:04 compute-1 sudo[140085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:05 compute-1 python3.9[140087]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:05 compute-1 systemd[1]: Reloading.
Jan 27 21:06:05 compute-1 systemd-rc-local-generator[140118]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:06:05 compute-1 systemd-sysv-generator[140122]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:06:05 compute-1 sudo[140085]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:05 compute-1 sudo[140275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nblhfqaiepucwsqwtizzwuktoqeodfjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547965.5694737-706-253388101335443/AnsiballZ_systemd.py'
Jan 27 21:06:05 compute-1 sudo[140275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:06 compute-1 python3.9[140277]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:06 compute-1 sudo[140275]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:06 compute-1 sudo[140430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwtehpyhgfuttoamniwdtlqkoqielldz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547966.5199084-706-236772028025891/AnsiballZ_systemd.py'
Jan 27 21:06:06 compute-1 sudo[140430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:07 compute-1 python3.9[140432]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:07 compute-1 systemd[1]: Reloading.
Jan 27 21:06:07 compute-1 systemd-sysv-generator[140468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:06:07 compute-1 systemd-rc-local-generator[140463]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:06:07 compute-1 sudo[140430]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:08 compute-1 sudo[140620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dohwpecuwbdxehmetraanrthqfuunjnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547967.8296816-778-153375460940401/AnsiballZ_systemd.py'
Jan 27 21:06:08 compute-1 sudo[140620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:08 compute-1 python3.9[140622]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 27 21:06:08 compute-1 systemd[1]: Reloading.
Jan 27 21:06:08 compute-1 systemd-sysv-generator[140656]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:06:08 compute-1 systemd-rc-local-generator[140652]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:06:08 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 27 21:06:08 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 27 21:06:08 compute-1 sudo[140620]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:09 compute-1 sudo[140830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqjuuzgdnbswzhwjslqxifnbsbaaouws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547969.3045425-794-271118657535273/AnsiballZ_systemd.py'
Jan 27 21:06:09 compute-1 sudo[140830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:09 compute-1 podman[140787]: 2026-01-27 21:06:09.732098791 +0000 UTC m=+0.106300258 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_controller)
Jan 27 21:06:10 compute-1 python3.9[140838]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:10 compute-1 sudo[140830]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:10 compute-1 sudo[140995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryjqiadzkqavosuthalnrrucebrodcne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547970.2345674-794-81447481527794/AnsiballZ_systemd.py'
Jan 27 21:06:10 compute-1 sudo[140995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:10 compute-1 python3.9[140997]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:10 compute-1 sudo[140995]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:06:11.137 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:06:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:06:11.138 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:06:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:06:11.138 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:06:11 compute-1 sudo[141151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jllyjlgtiyczlycedolklddxqycgxwek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547971.1018317-794-224457541633575/AnsiballZ_systemd.py'
Jan 27 21:06:11 compute-1 sudo[141151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:11 compute-1 python3.9[141153]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:11 compute-1 sudo[141151]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:12 compute-1 sudo[141306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hentsnqpklvlnrfricvuzrwcoiyqnsjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547971.940859-794-16857608977193/AnsiballZ_systemd.py'
Jan 27 21:06:12 compute-1 sudo[141306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:12 compute-1 python3.9[141308]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:12 compute-1 sudo[141306]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:13 compute-1 sudo[141461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmidobpzwaictsctxrbdbrlhpwsfqekt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547972.7526193-794-168337607287146/AnsiballZ_systemd.py'
Jan 27 21:06:13 compute-1 sudo[141461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:13 compute-1 python3.9[141463]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:13 compute-1 sudo[141461]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:14 compute-1 sudo[141631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqykhuanlbxlsuzsjriskrrohfqbxtmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547973.6419184-794-42309350903344/AnsiballZ_systemd.py'
Jan 27 21:06:14 compute-1 sudo[141631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:14 compute-1 podman[141590]: 2026-01-27 21:06:14.030566806 +0000 UTC m=+0.068783949 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 27 21:06:14 compute-1 python3.9[141639]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:14 compute-1 sudo[141631]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:14 compute-1 sudo[141792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orodppqzomthchovancsyuqpfacmjsry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547974.5280418-794-13446236843138/AnsiballZ_systemd.py'
Jan 27 21:06:14 compute-1 sudo[141792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:15 compute-1 python3.9[141794]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:15 compute-1 sudo[141792]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:15 compute-1 sudo[141947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yczjmziydwtxfhzrtjrcuinmwtyqcupt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547975.3748245-794-99264422353069/AnsiballZ_systemd.py'
Jan 27 21:06:15 compute-1 sudo[141947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:15 compute-1 python3.9[141949]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:16 compute-1 sudo[141947]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:16 compute-1 sudo[142102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpuacvdpsqzyeqxivwwdzfudbcqogfwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547976.250193-794-131877454486166/AnsiballZ_systemd.py'
Jan 27 21:06:16 compute-1 sudo[142102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:16 compute-1 python3.9[142104]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:16 compute-1 sudo[142102]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:17 compute-1 sudo[142257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lufdjeobbzdxffoxeugfawyuwuxogsej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547977.0413623-794-160452589850060/AnsiballZ_systemd.py'
Jan 27 21:06:17 compute-1 sudo[142257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:17 compute-1 python3.9[142259]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:17 compute-1 sudo[142257]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:18 compute-1 sudo[142412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggulvlfrnwmlizlvsydzartcyqdkecrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547977.8988662-794-280738115519164/AnsiballZ_systemd.py'
Jan 27 21:06:18 compute-1 sudo[142412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:18 compute-1 python3.9[142414]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:18 compute-1 sudo[142412]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:19 compute-1 sudo[142567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxfuerdgkoqkkpapiqzinqpniwrezsmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547978.7151587-794-5624774285117/AnsiballZ_systemd.py'
Jan 27 21:06:19 compute-1 sudo[142567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:19 compute-1 python3.9[142569]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:19 compute-1 sudo[142567]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:19 compute-1 sudo[142722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiuvevwfbxfshkqcwlszvryjwiirhhrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547979.5741625-794-138483467849475/AnsiballZ_systemd.py'
Jan 27 21:06:19 compute-1 sudo[142722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:20 compute-1 python3.9[142724]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:20 compute-1 sudo[142722]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:20 compute-1 sudo[142877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eltxmozsfjbafawjzdnvmzsiigakrjqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547980.4258702-794-17807989269748/AnsiballZ_systemd.py'
Jan 27 21:06:20 compute-1 sudo[142877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:21 compute-1 python3.9[142879]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 27 21:06:22 compute-1 sudo[142877]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:23 compute-1 sudo[143032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alnowwxglivvguvrfumstsefnzrejqmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547982.736502-998-65644062556560/AnsiballZ_file.py'
Jan 27 21:06:23 compute-1 sudo[143032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:23 compute-1 python3.9[143034]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:06:23 compute-1 sudo[143032]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:23 compute-1 sudo[143184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyjokvgwlnqfrhqzekqrppbzksgifnjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547983.384866-998-225010419717385/AnsiballZ_file.py'
Jan 27 21:06:23 compute-1 sudo[143184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:23 compute-1 python3.9[143186]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:06:23 compute-1 sudo[143184]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:24 compute-1 sudo[143336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwyvfhfrxuumyqktalpuxfjapeedyvsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547983.997376-998-225447214789355/AnsiballZ_file.py'
Jan 27 21:06:24 compute-1 sudo[143336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:24 compute-1 python3.9[143338]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:06:24 compute-1 sudo[143336]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:24 compute-1 sudo[143488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlwsyhqonmwjnlczocgptdwhwjwewgwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547984.676435-998-67665684148016/AnsiballZ_file.py'
Jan 27 21:06:24 compute-1 sudo[143488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:25 compute-1 python3.9[143490]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:06:25 compute-1 sudo[143488]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:25 compute-1 sudo[143640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kayrxdzxdoxsaglrhxszipruijtybcgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547985.3080802-998-211756855039621/AnsiballZ_file.py'
Jan 27 21:06:25 compute-1 sudo[143640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:25 compute-1 python3.9[143642]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:06:25 compute-1 sudo[143640]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:26 compute-1 sudo[143792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsiwjozhtpiyavflxxsezfxydcavdzrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547985.971656-998-266779361612722/AnsiballZ_file.py'
Jan 27 21:06:26 compute-1 sudo[143792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:26 compute-1 python3.9[143794]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:06:26 compute-1 sudo[143792]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:27 compute-1 python3.9[143944]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:06:28 compute-1 sudo[144094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rettswyetxucilxupuekdzvdetkzgyxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547987.6891406-1100-173208441259684/AnsiballZ_stat.py'
Jan 27 21:06:28 compute-1 sudo[144094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:28 compute-1 python3.9[144096]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:28 compute-1 sudo[144094]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:28 compute-1 sudo[144219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmdrfoqdwfbbodreyyhxittjqmkrcrsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547987.6891406-1100-173208441259684/AnsiballZ_copy.py'
Jan 27 21:06:28 compute-1 sudo[144219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:29 compute-1 python3.9[144221]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769547987.6891406-1100-173208441259684/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:29 compute-1 sudo[144219]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:29 compute-1 sudo[144371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnnoazrefyivgkkzzctkxyuqncchsvtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547989.246054-1100-267735057958584/AnsiballZ_stat.py'
Jan 27 21:06:29 compute-1 sudo[144371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:29 compute-1 python3.9[144373]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:29 compute-1 sudo[144371]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:30 compute-1 sudo[144496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbqjhgeshvlawzsudvroxbkkyqyqzoks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547989.246054-1100-267735057958584/AnsiballZ_copy.py'
Jan 27 21:06:30 compute-1 sudo[144496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:30 compute-1 python3.9[144498]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769547989.246054-1100-267735057958584/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:30 compute-1 sudo[144496]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:30 compute-1 sudo[144648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvtujrfzkjwvqlpunulcnfogpzszvqnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547990.5676184-1100-169108135810135/AnsiballZ_stat.py'
Jan 27 21:06:30 compute-1 sudo[144648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:31 compute-1 python3.9[144650]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:31 compute-1 sudo[144648]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:31 compute-1 sudo[144773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmbsxhdclcxjpdjxlyknlbgryintyjhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547990.5676184-1100-169108135810135/AnsiballZ_copy.py'
Jan 27 21:06:31 compute-1 sudo[144773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:31 compute-1 python3.9[144775]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769547990.5676184-1100-169108135810135/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:31 compute-1 sudo[144773]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:32 compute-1 sudo[144925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlaiwdmnriypcvhcnbomwyegvitaibhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547991.8864496-1100-134628841347407/AnsiballZ_stat.py'
Jan 27 21:06:32 compute-1 sudo[144925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:32 compute-1 python3.9[144927]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:32 compute-1 sudo[144925]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:32 compute-1 sudo[145050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltgfgzzybpiqomtbclkoemutjphorjrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547991.8864496-1100-134628841347407/AnsiballZ_copy.py'
Jan 27 21:06:32 compute-1 sudo[145050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:32 compute-1 python3.9[145052]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769547991.8864496-1100-134628841347407/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:32 compute-1 sudo[145050]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:33 compute-1 sudo[145202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsezxfsjsvtwdxfqubtuhnkzffxzwowh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547992.9843986-1100-98323310018096/AnsiballZ_stat.py'
Jan 27 21:06:33 compute-1 sudo[145202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:33 compute-1 python3.9[145204]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:33 compute-1 sudo[145202]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:33 compute-1 sudo[145327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcenmosmnsqrtivwftgllcmjtavshand ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547992.9843986-1100-98323310018096/AnsiballZ_copy.py'
Jan 27 21:06:33 compute-1 sudo[145327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:34 compute-1 python3.9[145329]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769547992.9843986-1100-98323310018096/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:34 compute-1 sudo[145327]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:34 compute-1 sudo[145479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggnzmnisfuwccymonxpbtdknndvqntkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547994.2646623-1100-42123501467292/AnsiballZ_stat.py'
Jan 27 21:06:34 compute-1 sudo[145479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:34 compute-1 python3.9[145481]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:34 compute-1 sudo[145479]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:35 compute-1 sudo[145604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdsqgahxumqzhriiwisraxevxrcqkulx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547994.2646623-1100-42123501467292/AnsiballZ_copy.py'
Jan 27 21:06:35 compute-1 sudo[145604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:35 compute-1 python3.9[145606]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769547994.2646623-1100-42123501467292/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:35 compute-1 sudo[145604]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:35 compute-1 sudo[145756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhoknidrzapxkvritcmrpzkexicsngsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547995.5487344-1100-101009995300788/AnsiballZ_stat.py'
Jan 27 21:06:35 compute-1 sudo[145756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:36 compute-1 python3.9[145758]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:36 compute-1 sudo[145756]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:36 compute-1 sudo[145879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlaoekjcetwnchakuvlusztnjutkpswd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547995.5487344-1100-101009995300788/AnsiballZ_copy.py'
Jan 27 21:06:36 compute-1 sudo[145879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:36 compute-1 python3.9[145881]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769547995.5487344-1100-101009995300788/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:36 compute-1 sudo[145879]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:37 compute-1 sudo[146031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xljxgqljujhbmseamycpwqjiburipabp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547996.8222353-1100-77551368373063/AnsiballZ_stat.py'
Jan 27 21:06:37 compute-1 sudo[146031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:37 compute-1 python3.9[146033]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:37 compute-1 sudo[146031]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:37 compute-1 sudo[146156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfrsezqwgvaddvwnssuegyagcqfvxzrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547996.8222353-1100-77551368373063/AnsiballZ_copy.py'
Jan 27 21:06:37 compute-1 sudo[146156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:37 compute-1 python3.9[146158]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769547996.8222353-1100-77551368373063/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:37 compute-1 sudo[146156]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:38 compute-1 sudo[146308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqtyvlvuglcyvrrdteboswfukwlwvfiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547998.3269458-1326-265776931990849/AnsiballZ_command.py'
Jan 27 21:06:38 compute-1 sudo[146308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:38 compute-1 python3.9[146310]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 27 21:06:38 compute-1 sudo[146308]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:39 compute-1 sudo[146461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhlhjhsxuxuidiaftrizidyuhfinrnqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547999.1612015-1344-134730360229007/AnsiballZ_file.py'
Jan 27 21:06:39 compute-1 sudo[146461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:39 compute-1 python3.9[146463]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:39 compute-1 sudo[146461]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:40 compute-1 sudo[146629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxbagbjyyrwucigsbgwfgwzzlcchcvlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769547999.7937977-1344-253259939926570/AnsiballZ_file.py'
Jan 27 21:06:40 compute-1 sudo[146629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:40 compute-1 podman[146587]: 2026-01-27 21:06:40.16412983 +0000 UTC m=+0.101307949 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 27 21:06:40 compute-1 python3.9[146636]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:40 compute-1 sudo[146629]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:40 compute-1 sudo[146792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmaibxwmgowizcueagcysmjthquxqtnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548000.491048-1344-131004132674636/AnsiballZ_file.py'
Jan 27 21:06:40 compute-1 sudo[146792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:40 compute-1 python3.9[146794]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:40 compute-1 sudo[146792]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:41 compute-1 sudo[146944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihfjofzttcwmlkdaekfmkhswyjcyjxav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548001.0859451-1344-165801239080186/AnsiballZ_file.py'
Jan 27 21:06:41 compute-1 sudo[146944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:41 compute-1 python3.9[146946]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:41 compute-1 sudo[146944]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:42 compute-1 sudo[147096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxkxlvfvteapevwgnksxhqcosnchyjrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548001.7216237-1344-230024126444227/AnsiballZ_file.py'
Jan 27 21:06:42 compute-1 sudo[147096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:42 compute-1 python3.9[147098]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:42 compute-1 sudo[147096]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:42 compute-1 sudo[147248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixlbyumkemsucqbehkwntxktodldxurd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548002.3623376-1344-122373673116480/AnsiballZ_file.py'
Jan 27 21:06:42 compute-1 sudo[147248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:42 compute-1 python3.9[147250]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:42 compute-1 sudo[147248]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:43 compute-1 sudo[147400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crhlqskcgxxxymrbhkkwggiridbvthsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548003.166145-1344-51980610584829/AnsiballZ_file.py'
Jan 27 21:06:43 compute-1 sudo[147400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:43 compute-1 python3.9[147402]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:43 compute-1 sudo[147400]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:44 compute-1 podman[147526]: 2026-01-27 21:06:44.159772965 +0000 UTC m=+0.059699184 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 27 21:06:44 compute-1 sudo[147569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emzkgkpzteefcizaeeiuvwszqculprxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548003.862463-1344-217028305796619/AnsiballZ_file.py'
Jan 27 21:06:44 compute-1 sudo[147569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:44 compute-1 python3.9[147573]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:44 compute-1 sudo[147569]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:44 compute-1 sudo[147724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnsbfxbovyhqhuuevxsupondhojqltgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548004.515775-1344-123704667293250/AnsiballZ_file.py'
Jan 27 21:06:44 compute-1 sudo[147724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:44 compute-1 python3.9[147726]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:44 compute-1 sudo[147724]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:45 compute-1 sudo[147876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohkpsjbpmxiwxhjfrrfuubfnpsnetuql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548005.1246371-1344-249527852042384/AnsiballZ_file.py'
Jan 27 21:06:45 compute-1 sudo[147876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:45 compute-1 python3.9[147878]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:45 compute-1 sudo[147876]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:46 compute-1 sudo[148028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owjrefrsscimploofmfarzarpgdbkahy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548005.8181236-1344-38248830418182/AnsiballZ_file.py'
Jan 27 21:06:46 compute-1 sudo[148028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:46 compute-1 python3.9[148030]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:46 compute-1 sudo[148028]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:46 compute-1 sudo[148180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycatgxpgjyikizsioiuwizjiuqscwtby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548006.511778-1344-70756457101355/AnsiballZ_file.py'
Jan 27 21:06:46 compute-1 sudo[148180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:46 compute-1 python3.9[148182]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:47 compute-1 sudo[148180]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:47 compute-1 sudo[148332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woxgstwiiqbgqlxkwuokjwgbprafwkib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548007.1359758-1344-137110221684239/AnsiballZ_file.py'
Jan 27 21:06:47 compute-1 sudo[148332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:47 compute-1 python3.9[148334]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:47 compute-1 sudo[148332]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:48 compute-1 sudo[148484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiadrxzheahrqivzxhcxfliqwogclikj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548007.7867796-1344-32971892938464/AnsiballZ_file.py'
Jan 27 21:06:48 compute-1 sudo[148484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:48 compute-1 python3.9[148486]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:48 compute-1 sudo[148484]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:48 compute-1 sudo[148636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivmsvynvfaijlnowjsyxpcfyortveqhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548008.6638532-1542-183680927225727/AnsiballZ_stat.py'
Jan 27 21:06:48 compute-1 sudo[148636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:49 compute-1 python3.9[148638]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:49 compute-1 sudo[148636]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:49 compute-1 sudo[148759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucybowfxarljvrcsxhuaeyhweudtaham ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548008.6638532-1542-183680927225727/AnsiballZ_copy.py'
Jan 27 21:06:49 compute-1 sudo[148759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:49 compute-1 python3.9[148761]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548008.6638532-1542-183680927225727/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:49 compute-1 sudo[148759]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:50 compute-1 sudo[148911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxyhcssalhbtcdbpiyheeewkgumqjbpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548009.8333821-1542-204172592298812/AnsiballZ_stat.py'
Jan 27 21:06:50 compute-1 sudo[148911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:50 compute-1 python3.9[148913]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:50 compute-1 sudo[148911]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:50 compute-1 sudo[149034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqawxgxdxdoyxyzpjirtnzrtcnrnuiqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548009.8333821-1542-204172592298812/AnsiballZ_copy.py'
Jan 27 21:06:50 compute-1 sudo[149034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:50 compute-1 python3.9[149036]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548009.8333821-1542-204172592298812/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:50 compute-1 sudo[149034]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:51 compute-1 sudo[149186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivcycjpzzegncrradkcdnjevunderdlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548011.0974085-1542-18789052705015/AnsiballZ_stat.py'
Jan 27 21:06:51 compute-1 sudo[149186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:51 compute-1 python3.9[149188]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:51 compute-1 sudo[149186]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:51 compute-1 sudo[149309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-budphuudkzmotirtwxgbgsuctowzugny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548011.0974085-1542-18789052705015/AnsiballZ_copy.py'
Jan 27 21:06:51 compute-1 sudo[149309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:52 compute-1 python3.9[149311]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548011.0974085-1542-18789052705015/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:52 compute-1 sudo[149309]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:52 compute-1 sudo[149461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwjbeweuxoiutsxwkuajzolaxsretenh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548012.2214158-1542-57096973753576/AnsiballZ_stat.py'
Jan 27 21:06:52 compute-1 sudo[149461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:52 compute-1 python3.9[149463]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:52 compute-1 sudo[149461]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:53 compute-1 sudo[149584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoxlnprkcsnrweunmuzynjsxmsbrodxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548012.2214158-1542-57096973753576/AnsiballZ_copy.py'
Jan 27 21:06:53 compute-1 sudo[149584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:53 compute-1 python3.9[149586]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548012.2214158-1542-57096973753576/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:53 compute-1 sudo[149584]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:53 compute-1 sudo[149736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyyladrgxqimqmgqkpsqcriyfxkkhdss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548013.3997962-1542-27224320522218/AnsiballZ_stat.py'
Jan 27 21:06:53 compute-1 sudo[149736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:53 compute-1 python3.9[149738]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:53 compute-1 sudo[149736]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:54 compute-1 sudo[149859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okfboqotzvjsflhfqqkzgobwacvjxobd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548013.3997962-1542-27224320522218/AnsiballZ_copy.py'
Jan 27 21:06:54 compute-1 sudo[149859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:54 compute-1 python3.9[149861]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548013.3997962-1542-27224320522218/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:54 compute-1 sudo[149859]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:54 compute-1 sudo[150011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkikxlupjyubxhtngdyoshszfshnecsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548014.6321492-1542-164916247693158/AnsiballZ_stat.py'
Jan 27 21:06:54 compute-1 sudo[150011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:55 compute-1 python3.9[150013]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:55 compute-1 sudo[150011]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:55 compute-1 sudo[150134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzhoerlfhvdvgncjjozzsnlnvmjzzsyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548014.6321492-1542-164916247693158/AnsiballZ_copy.py'
Jan 27 21:06:55 compute-1 sudo[150134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:55 compute-1 python3.9[150136]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548014.6321492-1542-164916247693158/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:55 compute-1 sudo[150134]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:56 compute-1 sudo[150286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wohnlghflehhzopvexdxoqzmyibetyse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548015.886323-1542-151293160521030/AnsiballZ_stat.py'
Jan 27 21:06:56 compute-1 sudo[150286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:56 compute-1 python3.9[150288]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:56 compute-1 sudo[150286]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:56 compute-1 sudo[150409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhjybbuzqdnwlfzqwfguwqxjjwehftnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548015.886323-1542-151293160521030/AnsiballZ_copy.py'
Jan 27 21:06:56 compute-1 sudo[150409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:56 compute-1 python3.9[150411]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548015.886323-1542-151293160521030/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:56 compute-1 sudo[150409]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:57 compute-1 sudo[150561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdbbzqcywnznvtmjowzgemxarantkdca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548017.1456122-1542-246317284482407/AnsiballZ_stat.py'
Jan 27 21:06:57 compute-1 sudo[150561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:57 compute-1 python3.9[150563]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:57 compute-1 sudo[150561]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:58 compute-1 sudo[150684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npckyjzqbhdmcmvstkpblyzvegpmhqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548017.1456122-1542-246317284482407/AnsiballZ_copy.py'
Jan 27 21:06:58 compute-1 sudo[150684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:58 compute-1 python3.9[150686]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548017.1456122-1542-246317284482407/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:58 compute-1 sudo[150684]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:58 compute-1 sudo[150836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-payikqnokfdoquvhtguwomseqhisuyjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548018.483412-1542-224767619822859/AnsiballZ_stat.py'
Jan 27 21:06:58 compute-1 sudo[150836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:58 compute-1 python3.9[150838]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:06:59 compute-1 sudo[150836]: pam_unix(sudo:session): session closed for user root
Jan 27 21:06:59 compute-1 sudo[150959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vakfnloxlcogzigrdtturlprgodnszdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548018.483412-1542-224767619822859/AnsiballZ_copy.py'
Jan 27 21:06:59 compute-1 sudo[150959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:06:59 compute-1 python3.9[150961]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548018.483412-1542-224767619822859/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:06:59 compute-1 sudo[150959]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:00 compute-1 sudo[151111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlhewzpvfmmermveldhdlasmnjbcukjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548020.0067613-1542-226237155093084/AnsiballZ_stat.py'
Jan 27 21:07:00 compute-1 sudo[151111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:00 compute-1 python3.9[151113]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:00 compute-1 sudo[151111]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:01 compute-1 sudo[151234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzawbtmtjgugimiziyklaneggrjdnxol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548020.0067613-1542-226237155093084/AnsiballZ_copy.py'
Jan 27 21:07:01 compute-1 sudo[151234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:01 compute-1 python3.9[151236]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548020.0067613-1542-226237155093084/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:01 compute-1 sudo[151234]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:01 compute-1 sudo[151386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkzvcvcfslcicljpfjwtatpetclbympy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548021.6089914-1542-35541971375056/AnsiballZ_stat.py'
Jan 27 21:07:01 compute-1 sudo[151386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:02 compute-1 python3.9[151388]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:02 compute-1 sudo[151386]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:02 compute-1 sudo[151509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avidijjefhsdvlycqdkpwveiywpcrova ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548021.6089914-1542-35541971375056/AnsiballZ_copy.py'
Jan 27 21:07:02 compute-1 sudo[151509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:02 compute-1 python3.9[151511]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548021.6089914-1542-35541971375056/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:02 compute-1 sudo[151509]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:03 compute-1 sudo[151661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdaktmrhjvdurvlntcddpbiwlefkodwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548022.809053-1542-189727607649163/AnsiballZ_stat.py'
Jan 27 21:07:03 compute-1 sudo[151661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:03 compute-1 python3.9[151663]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:03 compute-1 sudo[151661]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:03 compute-1 sudo[151784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzgafrxjenroegzhdimrfzgrmlihfsow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548022.809053-1542-189727607649163/AnsiballZ_copy.py'
Jan 27 21:07:03 compute-1 sudo[151784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:04 compute-1 python3.9[151786]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548022.809053-1542-189727607649163/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:04 compute-1 sudo[151784]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:04 compute-1 sudo[151936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agcxwukftfszhvocmreenmikchfextmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548024.3557467-1542-109463254248788/AnsiballZ_stat.py'
Jan 27 21:07:04 compute-1 sudo[151936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:04 compute-1 python3.9[151938]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:04 compute-1 sudo[151936]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:05 compute-1 sudo[152059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzlemwxswpvvjbejluyjwqfbksxdzvud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548024.3557467-1542-109463254248788/AnsiballZ_copy.py'
Jan 27 21:07:05 compute-1 sudo[152059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:05 compute-1 python3.9[152061]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548024.3557467-1542-109463254248788/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:05 compute-1 sudo[152059]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:06 compute-1 sudo[152211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfuzwawaynlbhhoyqtgzzflcrpjmcnyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548025.938604-1542-196523688734644/AnsiballZ_stat.py'
Jan 27 21:07:06 compute-1 sudo[152211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:06 compute-1 python3.9[152213]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:06 compute-1 sudo[152211]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:06 compute-1 sudo[152334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piyejceepzglomwvdgtbqcxwqqlzybgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548025.938604-1542-196523688734644/AnsiballZ_copy.py'
Jan 27 21:07:06 compute-1 sudo[152334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:07 compute-1 python3.9[152336]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548025.938604-1542-196523688734644/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:07 compute-1 sudo[152334]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:08 compute-1 python3.9[152486]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:07:09 compute-1 sudo[152639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcbcysmuctqihqpxgerzkwiyyxnsfnwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548028.6717162-1954-51491067452803/AnsiballZ_seboolean.py'
Jan 27 21:07:09 compute-1 sudo[152639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:09 compute-1 python3.9[152641]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 27 21:07:10 compute-1 sudo[152639]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:10 compute-1 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 27 21:07:10 compute-1 podman[152670]: 2026-01-27 21:07:10.778452738 +0000 UTC m=+0.088998799 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true)
Jan 27 21:07:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:07:11.139 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:07:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:07:11.139 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:07:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:07:11.140 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:07:11 compute-1 sudo[152823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwoqmrpffbcvssoifaywjksyxumqdzpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548031.3223588-1970-20935203477554/AnsiballZ_copy.py'
Jan 27 21:07:11 compute-1 sudo[152823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:11 compute-1 python3.9[152825]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:11 compute-1 sudo[152823]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:12 compute-1 sudo[152975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytdbgxnweimyrvkuakijozjnbsmakbvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548031.9550257-1970-97176796825362/AnsiballZ_copy.py'
Jan 27 21:07:12 compute-1 sudo[152975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:12 compute-1 python3.9[152977]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:12 compute-1 sudo[152975]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:12 compute-1 sudo[153127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frvzhdpobetwosqvewadagchxyabacek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548032.5968952-1970-53339831065424/AnsiballZ_copy.py'
Jan 27 21:07:12 compute-1 sudo[153127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:13 compute-1 python3.9[153129]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:13 compute-1 sudo[153127]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:14 compute-1 sudo[153279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ablwmbnoatipyefsjgqjqdyvctqshrbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548033.7359998-1970-163810095011220/AnsiballZ_copy.py'
Jan 27 21:07:14 compute-1 sudo[153279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:14 compute-1 python3.9[153281]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:14 compute-1 sudo[153279]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:14 compute-1 podman[153329]: 2026-01-27 21:07:14.737777169 +0000 UTC m=+0.052287520 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 27 21:07:15 compute-1 sudo[153450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udoepcvphyvkrwggiwyvmrbiqmzuemnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548034.6493385-1970-184175117762984/AnsiballZ_copy.py'
Jan 27 21:07:15 compute-1 sudo[153450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:15 compute-1 python3.9[153452]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:15 compute-1 sudo[153450]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:16 compute-1 sudo[153602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yumqsdiuavdtxlabtijdpkmxtqkgesgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548035.6468127-2042-92349526012918/AnsiballZ_copy.py'
Jan 27 21:07:16 compute-1 sudo[153602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:16 compute-1 python3.9[153604]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:16 compute-1 sudo[153602]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:17 compute-1 sudo[153754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nogtsitxctawoawvbleunocsbuaeekin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548036.8007233-2042-64761807784587/AnsiballZ_copy.py'
Jan 27 21:07:17 compute-1 sudo[153754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:17 compute-1 python3.9[153756]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:17 compute-1 sudo[153754]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:17 compute-1 sudo[153906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbafojvpwcaxsflacchqykqanknhxrgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548037.4586036-2042-174286662234384/AnsiballZ_copy.py'
Jan 27 21:07:17 compute-1 sudo[153906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:17 compute-1 python3.9[153908]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:17 compute-1 sudo[153906]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:18 compute-1 sudo[154058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqawjvoymdmuwryoxqvsibkysfqglsow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548038.345419-2042-60542396606366/AnsiballZ_copy.py'
Jan 27 21:07:18 compute-1 sudo[154058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:18 compute-1 python3.9[154060]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:18 compute-1 sudo[154058]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:19 compute-1 sudo[154210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpszcbaouxwqphgzinbcygggcthwsyjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548039.1917953-2042-62625856713726/AnsiballZ_copy.py'
Jan 27 21:07:19 compute-1 sudo[154210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:19 compute-1 python3.9[154212]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:19 compute-1 sudo[154210]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:20 compute-1 sudo[154362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qckcpxstgrimukiasexmgwtykijwgkup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548040.1266847-2114-38947015743001/AnsiballZ_systemd.py'
Jan 27 21:07:20 compute-1 sudo[154362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:20 compute-1 python3.9[154364]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 21:07:20 compute-1 systemd[1]: Reloading.
Jan 27 21:07:20 compute-1 systemd-rc-local-generator[154391]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:07:20 compute-1 systemd-sysv-generator[154394]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:07:20 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Jan 27 21:07:21 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Jan 27 21:07:21 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 27 21:07:21 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 27 21:07:21 compute-1 systemd[1]: Starting libvirt logging daemon...
Jan 27 21:07:21 compute-1 systemd[1]: Started libvirt logging daemon.
Jan 27 21:07:21 compute-1 sudo[154362]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:21 compute-1 sudo[154555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zedprniysmjgjlzxzlvzxpnwwkzdsjqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548041.3350272-2114-226528868653783/AnsiballZ_systemd.py'
Jan 27 21:07:21 compute-1 sudo[154555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:21 compute-1 python3.9[154557]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 21:07:21 compute-1 systemd[1]: Reloading.
Jan 27 21:07:22 compute-1 systemd-sysv-generator[154586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:07:22 compute-1 systemd-rc-local-generator[154583]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:07:22 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 27 21:07:22 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 27 21:07:22 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 27 21:07:22 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 27 21:07:22 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 27 21:07:22 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 27 21:07:22 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 27 21:07:22 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 27 21:07:22 compute-1 sudo[154555]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:22 compute-1 sudo[154771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kigfgoxqhjmaqigkosowllwwwmlhcgtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548042.4772618-2114-11995766997733/AnsiballZ_systemd.py'
Jan 27 21:07:22 compute-1 sudo[154771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:23 compute-1 python3.9[154773]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 21:07:23 compute-1 systemd[1]: Reloading.
Jan 27 21:07:23 compute-1 systemd-sysv-generator[154805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:07:23 compute-1 systemd-rc-local-generator[154802]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:07:23 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 27 21:07:23 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 27 21:07:23 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 27 21:07:23 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 27 21:07:23 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 27 21:07:23 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 27 21:07:23 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 27 21:07:23 compute-1 sudo[154771]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:23 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 27 21:07:23 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 27 21:07:23 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 27 21:07:23 compute-1 sudo[154991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qviypcnbfxazraxvcugwnasdrpdwgdiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548043.6353054-2114-274977088934336/AnsiballZ_systemd.py'
Jan 27 21:07:23 compute-1 sudo[154991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:24 compute-1 python3.9[154993]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 21:07:24 compute-1 systemd[1]: Reloading.
Jan 27 21:07:24 compute-1 systemd-sysv-generator[155026]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:07:24 compute-1 systemd-rc-local-generator[155022]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:07:24 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Jan 27 21:07:24 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 27 21:07:24 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 27 21:07:24 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 27 21:07:24 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 27 21:07:24 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 27 21:07:24 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 27 21:07:24 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 27 21:07:24 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 27 21:07:24 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 27 21:07:24 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 27 21:07:24 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 27 21:07:24 compute-1 sudo[154991]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:24 compute-1 setroubleshoot[154811]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 08a46f5d-8ca1-4385-805e-fb8c32c7d4e2
Jan 27 21:07:24 compute-1 setroubleshoot[154811]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 27 21:07:24 compute-1 setroubleshoot[154811]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 08a46f5d-8ca1-4385-805e-fb8c32c7d4e2
Jan 27 21:07:24 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 27 21:07:24 compute-1 setroubleshoot[154811]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 27 21:07:25 compute-1 sudo[155209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahjcgbrrrhwhqdcnmeirbkdgoifiexmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548044.8331468-2114-267643110141160/AnsiballZ_systemd.py'
Jan 27 21:07:25 compute-1 sudo[155209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:25 compute-1 python3.9[155211]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 21:07:25 compute-1 systemd[1]: Reloading.
Jan 27 21:07:25 compute-1 systemd-rc-local-generator[155238]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:07:25 compute-1 systemd-sysv-generator[155241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:07:25 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Jan 27 21:07:25 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Jan 27 21:07:25 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 27 21:07:25 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 27 21:07:25 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 27 21:07:25 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 27 21:07:25 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 27 21:07:25 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 27 21:07:25 compute-1 sudo[155209]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:26 compute-1 sudo[155421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzdoazuycnnqhfxnuiaanakneyyobgpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548046.3634768-2188-275754342043403/AnsiballZ_file.py'
Jan 27 21:07:26 compute-1 sudo[155421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:26 compute-1 python3.9[155423]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:26 compute-1 sudo[155421]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:27 compute-1 sudo[155573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rudiqdimjjvknfckzmeqbtqwxayhekwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548047.1016011-2204-228706908566115/AnsiballZ_find.py'
Jan 27 21:07:27 compute-1 sudo[155573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:27 compute-1 python3.9[155575]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 27 21:07:27 compute-1 sudo[155573]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:28 compute-1 sudo[155725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxuibeethncmewmblawjkvzayhmbrnxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548048.0804684-2232-101366805054946/AnsiballZ_stat.py'
Jan 27 21:07:28 compute-1 sudo[155725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:28 compute-1 python3.9[155727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:28 compute-1 sudo[155725]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:28 compute-1 sudo[155848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gawytkdqezgtcfboxqkwskzgmeukschv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548048.0804684-2232-101366805054946/AnsiballZ_copy.py'
Jan 27 21:07:28 compute-1 sudo[155848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:29 compute-1 python3.9[155850]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548048.0804684-2232-101366805054946/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:29 compute-1 sudo[155848]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:29 compute-1 sudo[156000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzwwkscgwtirilosdbfloaroaspfpwqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548049.6508825-2264-103605027785410/AnsiballZ_file.py'
Jan 27 21:07:29 compute-1 sudo[156000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:30 compute-1 python3.9[156002]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:30 compute-1 sudo[156000]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:30 compute-1 sudo[156152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frnzclsaowzfptsnkyrpmdxbdlimbwkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548050.3435712-2280-196135185790314/AnsiballZ_stat.py'
Jan 27 21:07:30 compute-1 sudo[156152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:30 compute-1 python3.9[156154]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:30 compute-1 sudo[156152]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:31 compute-1 sudo[156230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajffxcwoprnooptlvtbsccwhrsavlpbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548050.3435712-2280-196135185790314/AnsiballZ_file.py'
Jan 27 21:07:31 compute-1 sudo[156230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:31 compute-1 python3.9[156232]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:31 compute-1 sudo[156230]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:31 compute-1 sudo[156382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojswgoplrgsyajwjfdgvojycopgyxkbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548051.4893918-2304-214470864115750/AnsiballZ_stat.py'
Jan 27 21:07:31 compute-1 sudo[156382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:31 compute-1 python3.9[156384]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:32 compute-1 sudo[156382]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:32 compute-1 sudo[156460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohnlygqetfvbesojbfwluzhljtqnkqsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548051.4893918-2304-214470864115750/AnsiballZ_file.py'
Jan 27 21:07:32 compute-1 sudo[156460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:32 compute-1 python3.9[156462]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.slmzx5vt recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:32 compute-1 sudo[156460]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:32 compute-1 sudo[156612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raausrcfowixlijpzrdhdbvlmagfywui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548052.6283538-2328-258546765169115/AnsiballZ_stat.py'
Jan 27 21:07:32 compute-1 sudo[156612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:33 compute-1 python3.9[156614]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:33 compute-1 sudo[156612]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:33 compute-1 sudo[156690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfsbypazrbuoojozuwhtjugynxopzmck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548052.6283538-2328-258546765169115/AnsiballZ_file.py'
Jan 27 21:07:33 compute-1 sudo[156690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:33 compute-1 python3.9[156692]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:33 compute-1 sudo[156690]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:34 compute-1 sudo[156842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grlceuzvvqxcptpobfzzoxobpumhmfuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548053.865494-2354-237074958839082/AnsiballZ_command.py'
Jan 27 21:07:34 compute-1 sudo[156842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:34 compute-1 python3.9[156844]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:07:34 compute-1 sudo[156842]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:34 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 27 21:07:34 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.003s CPU time.
Jan 27 21:07:34 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 27 21:07:35 compute-1 sudo[156995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faccilsgcwehwqdqfxibonehlbgyvahy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769548054.713563-2370-248825189563939/AnsiballZ_edpm_nftables_from_files.py'
Jan 27 21:07:35 compute-1 sudo[156995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:35 compute-1 python3[156997]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 27 21:07:35 compute-1 sudo[156995]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:36 compute-1 sudo[157147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsanumapduigvhwjlqoyztdrxjkwhqnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548055.7510529-2386-208885716683177/AnsiballZ_stat.py'
Jan 27 21:07:36 compute-1 sudo[157147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:36 compute-1 python3.9[157149]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:36 compute-1 sudo[157147]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:36 compute-1 sudo[157225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thoqqrxfwgkpyzmwazompvhrofsbhssi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548055.7510529-2386-208885716683177/AnsiballZ_file.py'
Jan 27 21:07:36 compute-1 sudo[157225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:36 compute-1 python3.9[157227]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:36 compute-1 sudo[157225]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:37 compute-1 sudo[157377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwsmpgqbksldtsfwsynwfhyfwgycdeze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548057.2178898-2410-138847887608370/AnsiballZ_stat.py'
Jan 27 21:07:37 compute-1 sudo[157377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:37 compute-1 python3.9[157379]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:37 compute-1 sudo[157377]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:38 compute-1 sudo[157502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkicshhaljhiezgfyjbnzoqumdesdpup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548057.2178898-2410-138847887608370/AnsiballZ_copy.py'
Jan 27 21:07:38 compute-1 sudo[157502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:38 compute-1 python3.9[157504]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548057.2178898-2410-138847887608370/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:38 compute-1 sudo[157502]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:38 compute-1 sudo[157654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbpurjrqgijssmvfgqlgjowvnxsifmba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548058.5508137-2440-83175933363145/AnsiballZ_stat.py'
Jan 27 21:07:38 compute-1 sudo[157654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:39 compute-1 python3.9[157656]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:39 compute-1 sudo[157654]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:39 compute-1 sudo[157732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibznmwkqhiictxsuwduxshoaecdvhmpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548058.5508137-2440-83175933363145/AnsiballZ_file.py'
Jan 27 21:07:39 compute-1 sudo[157732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:39 compute-1 python3.9[157734]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:39 compute-1 sudo[157732]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:40 compute-1 sudo[157884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbbsrnllcfyjkdpzxwigyvjyqnsxovnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548059.7793982-2464-231323953795560/AnsiballZ_stat.py'
Jan 27 21:07:40 compute-1 sudo[157884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:40 compute-1 python3.9[157886]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:40 compute-1 sudo[157884]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:40 compute-1 sudo[157962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzufkyxnmniluytkzlzrndryqzzoecqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548059.7793982-2464-231323953795560/AnsiballZ_file.py'
Jan 27 21:07:40 compute-1 sudo[157962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:40 compute-1 python3.9[157964]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:40 compute-1 sudo[157962]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:41 compute-1 sudo[158127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqkzevnxrwciqisozexoketmmkjokvoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548060.9853647-2489-194554743494735/AnsiballZ_stat.py'
Jan 27 21:07:41 compute-1 sudo[158127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:41 compute-1 podman[158088]: 2026-01-27 21:07:41.433292122 +0000 UTC m=+0.101846094 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Jan 27 21:07:41 compute-1 python3.9[158136]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:41 compute-1 sudo[158127]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:42 compute-1 sudo[158265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siedkubuiokfllsjplcadmatkftlgrts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548060.9853647-2489-194554743494735/AnsiballZ_copy.py'
Jan 27 21:07:42 compute-1 sudo[158265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:42 compute-1 python3.9[158267]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548060.9853647-2489-194554743494735/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:42 compute-1 sudo[158265]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:42 compute-1 sudo[158417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpggewxikfqqwmtjeoielkzykuqhzufd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548062.6196227-2518-73673424937306/AnsiballZ_file.py'
Jan 27 21:07:42 compute-1 sudo[158417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:43 compute-1 python3.9[158419]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:43 compute-1 sudo[158417]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:43 compute-1 sudo[158569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmlpdfmknltgjkmjtwuvjgtuxclfidlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548063.3568878-2534-25114069308237/AnsiballZ_command.py'
Jan 27 21:07:43 compute-1 sudo[158569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:43 compute-1 python3.9[158571]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:07:43 compute-1 sudo[158569]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:45 compute-1 sudo[158739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rogedoogvhtkjizfozldkcubhvsvyzas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548064.070909-2550-254783940930637/AnsiballZ_blockinfile.py'
Jan 27 21:07:45 compute-1 sudo[158739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:45 compute-1 podman[158698]: 2026-01-27 21:07:45.083505729 +0000 UTC m=+0.070548522 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 27 21:07:45 compute-1 python3.9[158742]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:45 compute-1 sudo[158739]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:45 compute-1 sudo[158892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzfahlnnhuawzcxvzphalqxpqobcmqtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548065.5861132-2568-163651814267968/AnsiballZ_command.py'
Jan 27 21:07:45 compute-1 sudo[158892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:46 compute-1 python3.9[158894]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:07:46 compute-1 sudo[158892]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:46 compute-1 sudo[159045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dddgpimypwsctktnislhjjtlnlcxdqhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548066.3511288-2584-138778401557769/AnsiballZ_stat.py'
Jan 27 21:07:46 compute-1 sudo[159045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:46 compute-1 python3.9[159047]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:07:46 compute-1 sudo[159045]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:47 compute-1 sudo[159199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uumnhtymfvyaikracsmgxdjigmalgoee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548067.0140567-2600-152829446456857/AnsiballZ_command.py'
Jan 27 21:07:47 compute-1 sudo[159199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:47 compute-1 python3.9[159201]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:07:47 compute-1 sudo[159199]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:48 compute-1 sudo[159354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gupgpswkrjocgwhkqrwbnzdxoottifpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548067.8837774-2616-72651466905568/AnsiballZ_file.py'
Jan 27 21:07:48 compute-1 sudo[159354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:48 compute-1 python3.9[159356]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:48 compute-1 sudo[159354]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:49 compute-1 sudo[159506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpdxaxpbdwripdvkplnyqhegqkkbnnwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548068.8947465-2632-2581523498153/AnsiballZ_stat.py'
Jan 27 21:07:49 compute-1 sudo[159506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:49 compute-1 python3.9[159508]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:49 compute-1 sudo[159506]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:49 compute-1 sudo[159629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eshqvwswgxpdzedhiynlzbqjsuxpgids ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548068.8947465-2632-2581523498153/AnsiballZ_copy.py'
Jan 27 21:07:49 compute-1 sudo[159629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:50 compute-1 python3.9[159631]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548068.8947465-2632-2581523498153/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:50 compute-1 sudo[159629]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:50 compute-1 sudo[159781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tanxwtklmlogjxhkfsvvayrciddvgoby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548070.5096583-2662-61673115084573/AnsiballZ_stat.py'
Jan 27 21:07:50 compute-1 sudo[159781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:50 compute-1 python3.9[159783]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:50 compute-1 sudo[159781]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:51 compute-1 sudo[159904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vavahsjbvqkxjivlfmiymadjndifpxtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548070.5096583-2662-61673115084573/AnsiballZ_copy.py'
Jan 27 21:07:51 compute-1 sudo[159904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:51 compute-1 python3.9[159906]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548070.5096583-2662-61673115084573/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:51 compute-1 sudo[159904]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:52 compute-1 sudo[160056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhilbrlriitsijqjgenuunsgugnojzmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548071.7540016-2692-184547731388483/AnsiballZ_stat.py'
Jan 27 21:07:52 compute-1 sudo[160056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:52 compute-1 python3.9[160058]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:07:52 compute-1 sudo[160056]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:52 compute-1 sudo[160179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zprmumlehtpvmoislhpyfjeyzelrlaur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548071.7540016-2692-184547731388483/AnsiballZ_copy.py'
Jan 27 21:07:52 compute-1 sudo[160179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:52 compute-1 python3.9[160181]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548071.7540016-2692-184547731388483/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:07:52 compute-1 sudo[160179]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:53 compute-1 sudo[160331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igllmclrombsaxelodncrgvtmookbrky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548073.178061-2722-259453277295113/AnsiballZ_systemd.py'
Jan 27 21:07:53 compute-1 sudo[160331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:54 compute-1 python3.9[160333]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:07:54 compute-1 systemd[1]: Reloading.
Jan 27 21:07:54 compute-1 systemd-rc-local-generator[160361]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:07:54 compute-1 systemd-sysv-generator[160366]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:07:54 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Jan 27 21:07:54 compute-1 sudo[160331]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:55 compute-1 sudo[160521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbgvliythrecyidmxfbraquehgflctsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548074.753287-2738-14897375421416/AnsiballZ_systemd.py'
Jan 27 21:07:55 compute-1 sudo[160521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:07:55 compute-1 python3.9[160523]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 27 21:07:55 compute-1 systemd[1]: Reloading.
Jan 27 21:07:55 compute-1 systemd-rc-local-generator[160549]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:07:55 compute-1 systemd-sysv-generator[160553]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:07:55 compute-1 systemd[1]: Reloading.
Jan 27 21:07:55 compute-1 systemd-rc-local-generator[160585]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:07:55 compute-1 systemd-sysv-generator[160591]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:07:55 compute-1 sudo[160521]: pam_unix(sudo:session): session closed for user root
Jan 27 21:07:56 compute-1 sshd-session[105799]: Connection closed by 192.168.122.30 port 42264
Jan 27 21:07:56 compute-1 sshd-session[105796]: pam_unix(sshd:session): session closed for user zuul
Jan 27 21:07:56 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Jan 27 21:07:56 compute-1 systemd[1]: session-24.scope: Consumed 3min 30.498s CPU time.
Jan 27 21:07:56 compute-1 systemd-logind[786]: Session 24 logged out. Waiting for processes to exit.
Jan 27 21:07:56 compute-1 systemd-logind[786]: Removed session 24.
Jan 27 21:08:01 compute-1 sshd-session[160619]: Accepted publickey for zuul from 192.168.122.30 port 44734 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 21:08:01 compute-1 systemd-logind[786]: New session 25 of user zuul.
Jan 27 21:08:01 compute-1 systemd[1]: Started Session 25 of User zuul.
Jan 27 21:08:01 compute-1 sshd-session[160619]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 21:08:02 compute-1 python3.9[160772]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:08:04 compute-1 python3.9[160926]: ansible-ansible.builtin.service_facts Invoked
Jan 27 21:08:04 compute-1 network[160943]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 27 21:08:04 compute-1 network[160944]: 'network-scripts' will be removed from distribution in near future.
Jan 27 21:08:04 compute-1 network[160945]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 27 21:08:10 compute-1 sudo[161214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aimrqtktosmjkgvskwrzkruoeubzeomd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548090.6120875-70-141066548063898/AnsiballZ_setup.py'
Jan 27 21:08:10 compute-1 sudo[161214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:08:11.142 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:08:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:08:11.143 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:08:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:08:11.143 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:08:11 compute-1 python3.9[161216]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 27 21:08:11 compute-1 sudo[161214]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:11 compute-1 podman[161226]: 2026-01-27 21:08:11.823218977 +0000 UTC m=+0.123451329 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260126, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 21:08:12 compute-1 sudo[161324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfvxgsgbxfefcksxlnezrnmntfkfopbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548090.6120875-70-141066548063898/AnsiballZ_dnf.py'
Jan 27 21:08:12 compute-1 sudo[161324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:12 compute-1 python3.9[161326]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 21:08:15 compute-1 podman[161328]: 2026-01-27 21:08:15.743213097 +0000 UTC m=+0.061190351 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 21:08:17 compute-1 sudo[161324]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:18 compute-1 sudo[161497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucdgesxwbsfsbexajapcexzyltscxhfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548097.8451984-94-253869494439881/AnsiballZ_stat.py'
Jan 27 21:08:18 compute-1 sudo[161497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:18 compute-1 python3.9[161499]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:08:18 compute-1 sudo[161497]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:19 compute-1 sudo[161649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqhkjgympdcwmmxqncmkmehsstouvdst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548099.0986872-114-7862717002358/AnsiballZ_command.py'
Jan 27 21:08:19 compute-1 sudo[161649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:19 compute-1 python3.9[161651]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:08:19 compute-1 sudo[161649]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:20 compute-1 sudo[161802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqxapuzhxgprvstssgzoxpnrlawaigsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548100.1647604-134-31258562979444/AnsiballZ_stat.py'
Jan 27 21:08:20 compute-1 sudo[161802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:20 compute-1 python3.9[161804]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:08:20 compute-1 sudo[161802]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:21 compute-1 sudo[161954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnlpbhbfigibktjfmyrcewpvepymaszs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548100.9390996-150-193817297994430/AnsiballZ_command.py'
Jan 27 21:08:21 compute-1 sudo[161954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:21 compute-1 python3.9[161956]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:08:21 compute-1 sudo[161954]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:21 compute-1 sudo[162107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybwkiljtubmdpjuhpbkgeqdvwsfubasj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548101.6388419-166-255975986969070/AnsiballZ_stat.py'
Jan 27 21:08:21 compute-1 sudo[162107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:22 compute-1 python3.9[162109]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:08:22 compute-1 sudo[162107]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:22 compute-1 sudo[162230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhkhheeimckfmamwdvrucxallaxgtwzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548101.6388419-166-255975986969070/AnsiballZ_copy.py'
Jan 27 21:08:22 compute-1 sudo[162230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:22 compute-1 python3.9[162232]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548101.6388419-166-255975986969070/.source.iscsi _original_basename=.6harj8cw follow=False checksum=3927cec4a17890f85679453d4f9f449e1832f065 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:22 compute-1 sudo[162230]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:23 compute-1 sudo[162382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdrdvibujqycgrzxmokvpuhndihklifs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548103.052078-196-110421464143199/AnsiballZ_file.py'
Jan 27 21:08:23 compute-1 sudo[162382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:23 compute-1 python3.9[162384]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:23 compute-1 sudo[162382]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:24 compute-1 sudo[162534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwgenloufpbyqicrxyzuyvpdywaugano ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548103.9461086-212-196301304403643/AnsiballZ_lineinfile.py'
Jan 27 21:08:24 compute-1 sudo[162534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:24 compute-1 python3.9[162536]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:24 compute-1 sudo[162534]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:26 compute-1 sudo[162686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acvglgwhpecigarguooqkjugkelgbnge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548105.3980577-230-271799302246975/AnsiballZ_systemd_service.py'
Jan 27 21:08:26 compute-1 sudo[162686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:26 compute-1 python3.9[162688]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:08:26 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 27 21:08:26 compute-1 sudo[162686]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:28 compute-1 sudo[162842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewvattwsowsjmfsgvckjoztvwlvxuacd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548106.6590834-246-154430499126816/AnsiballZ_systemd_service.py'
Jan 27 21:08:28 compute-1 sudo[162842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:28 compute-1 python3.9[162844]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:08:28 compute-1 systemd[1]: Reloading.
Jan 27 21:08:28 compute-1 systemd-rc-local-generator[162871]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:08:28 compute-1 systemd-sysv-generator[162877]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:08:28 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 27 21:08:28 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 27 21:08:28 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Jan 27 21:08:28 compute-1 systemd[1]: Started Open-iSCSI.
Jan 27 21:08:28 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 27 21:08:28 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 27 21:08:28 compute-1 sudo[162842]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:29 compute-1 python3.9[163043]: ansible-ansible.builtin.service_facts Invoked
Jan 27 21:08:29 compute-1 network[163060]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 27 21:08:29 compute-1 network[163061]: 'network-scripts' will be removed from distribution in near future.
Jan 27 21:08:29 compute-1 network[163062]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 27 21:08:34 compute-1 sudo[163331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odumcspletrgqzuoailofxaanjfaivyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548114.4211664-292-143008107508687/AnsiballZ_dnf.py'
Jan 27 21:08:34 compute-1 sudo[163331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:34 compute-1 python3.9[163333]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 21:08:37 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 27 21:08:37 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 27 21:08:37 compute-1 systemd[1]: Reloading.
Jan 27 21:08:37 compute-1 systemd-sysv-generator[163378]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:08:37 compute-1 systemd-rc-local-generator[163375]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:08:37 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 27 21:08:38 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 27 21:08:38 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 27 21:08:38 compute-1 systemd[1]: run-re0e531408ba6489d82c0f019bc7f8618.service: Deactivated successfully.
Jan 27 21:08:38 compute-1 sudo[163331]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:39 compute-1 sudo[163647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngcuuphsqtrfmocfouuygtjyghjobibn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548118.8600721-310-80227881531285/AnsiballZ_file.py'
Jan 27 21:08:39 compute-1 sudo[163647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:39 compute-1 python3.9[163649]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 27 21:08:39 compute-1 sudo[163647]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:40 compute-1 sudo[163799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpebxldsvdlgsgabogkpvarhsuiftzqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548119.6382961-326-256661732100760/AnsiballZ_modprobe.py'
Jan 27 21:08:40 compute-1 sudo[163799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:40 compute-1 python3.9[163801]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 27 21:08:40 compute-1 sudo[163799]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:40 compute-1 sudo[163955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzwgiykhbzkgiujhqsvvqwzjrkcaxzyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548120.4881988-342-142926736919791/AnsiballZ_stat.py'
Jan 27 21:08:40 compute-1 sudo[163955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:40 compute-1 python3.9[163957]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:08:40 compute-1 sudo[163955]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:41 compute-1 systemd[1]: Starting dnf makecache...
Jan 27 21:08:41 compute-1 sudo[164079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aniwnggskwghgjuwumotpgbvlclqizby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548120.4881988-342-142926736919791/AnsiballZ_copy.py'
Jan 27 21:08:41 compute-1 sudo[164079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:41 compute-1 dnf[164052]: Repository 'gating-repo' is missing name in configuration, using id.
Jan 27 21:08:41 compute-1 dnf[164052]: Metadata cache refreshed recently.
Jan 27 21:08:41 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 27 21:08:41 compute-1 systemd[1]: Finished dnf makecache.
Jan 27 21:08:41 compute-1 python3.9[164081]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548120.4881988-342-142926736919791/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:41 compute-1 sudo[164079]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:42 compute-1 sudo[164246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twkfpgvksjclthlqbrtmqzowrlvvpanc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548121.932469-374-217981034374207/AnsiballZ_lineinfile.py'
Jan 27 21:08:42 compute-1 sudo[164246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:42 compute-1 podman[164205]: 2026-01-27 21:08:42.338726061 +0000 UTC m=+0.116050579 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 21:08:42 compute-1 python3.9[164253]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:42 compute-1 sudo[164246]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:43 compute-1 sudo[164409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orgcdiltdsympnzlotlfqpbyxnaxfnfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548122.7476747-390-250778669339277/AnsiballZ_systemd.py'
Jan 27 21:08:43 compute-1 sudo[164409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:43 compute-1 python3.9[164411]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 21:08:43 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 27 21:08:43 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 27 21:08:43 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 27 21:08:43 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 27 21:08:43 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 27 21:08:43 compute-1 sudo[164409]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:44 compute-1 sudo[164565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otxlaxzddzdfhqhcfepnanveuntyuvar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548123.967515-406-193745447185716/AnsiballZ_command.py'
Jan 27 21:08:44 compute-1 sudo[164565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:44 compute-1 python3.9[164567]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:08:44 compute-1 sudo[164565]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:45 compute-1 sudo[164718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ongfkubsjjmdaystwwbceiwfuwuztdfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548124.8651505-426-158544777229327/AnsiballZ_stat.py'
Jan 27 21:08:45 compute-1 sudo[164718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:45 compute-1 python3.9[164720]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:08:45 compute-1 sudo[164718]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:46 compute-1 sudo[164880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbicmyrghvqmvnnyflgjvduwjfalpxwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548125.646932-444-249723653491148/AnsiballZ_stat.py'
Jan 27 21:08:46 compute-1 sudo[164880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:46 compute-1 podman[164844]: 2026-01-27 21:08:46.031340857 +0000 UTC m=+0.084025112 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 21:08:46 compute-1 python3.9[164889]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:08:46 compute-1 sudo[164880]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:46 compute-1 sudo[165012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjhjoidroowebvppqjpiinkijpqoonsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548125.646932-444-249723653491148/AnsiballZ_copy.py'
Jan 27 21:08:46 compute-1 sudo[165012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:46 compute-1 python3.9[165014]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548125.646932-444-249723653491148/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:46 compute-1 sudo[165012]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:47 compute-1 sudo[165164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqziqormuzdqenqyxlnijjmfvkrliciz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548127.1133215-474-236621407385539/AnsiballZ_command.py'
Jan 27 21:08:47 compute-1 sudo[165164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:47 compute-1 python3.9[165166]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:08:47 compute-1 sudo[165164]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:48 compute-1 sudo[165317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpycdgbudnvufvwmdmnhbojnctekyero ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548127.9048603-490-237909228260172/AnsiballZ_lineinfile.py'
Jan 27 21:08:48 compute-1 sudo[165317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:49 compute-1 python3.9[165319]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:49 compute-1 sudo[165317]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:49 compute-1 sudo[165469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqznytygxrgqkilfqyotocfgztbsiypk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548129.4102159-506-269622434504029/AnsiballZ_replace.py'
Jan 27 21:08:49 compute-1 sudo[165469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:50 compute-1 python3.9[165471]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:50 compute-1 sudo[165469]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:51 compute-1 sudo[165621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxaxdpwqlocdkkaikbfxvtgulwybstpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548130.6869586-522-160334193876089/AnsiballZ_replace.py'
Jan 27 21:08:51 compute-1 sudo[165621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:51 compute-1 python3.9[165623]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:51 compute-1 sudo[165621]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:51 compute-1 sudo[165773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-troajvragetxepgszdwcmwwpsmhtandj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548131.567544-540-65298405395030/AnsiballZ_lineinfile.py'
Jan 27 21:08:51 compute-1 sudo[165773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:52 compute-1 python3.9[165775]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:52 compute-1 sudo[165773]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:52 compute-1 sudo[165925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnugpjmphbwghaizhjtagvmfcquiuowy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548132.2232242-540-181478813776213/AnsiballZ_lineinfile.py'
Jan 27 21:08:52 compute-1 sudo[165925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:52 compute-1 python3.9[165927]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:52 compute-1 sudo[165925]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:53 compute-1 sudo[166077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlzbtauuqoutjzftpsvfncuhwhudnvoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548132.9008684-540-245107029924293/AnsiballZ_lineinfile.py'
Jan 27 21:08:53 compute-1 sudo[166077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:53 compute-1 python3.9[166079]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:53 compute-1 sudo[166077]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:53 compute-1 sudo[166229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvbbwgqexpdteexppcfprjpvzxcupjxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548133.6102533-540-108312828084928/AnsiballZ_lineinfile.py'
Jan 27 21:08:53 compute-1 sudo[166229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:54 compute-1 python3.9[166231]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:08:54 compute-1 sudo[166229]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:54 compute-1 sudo[166381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwiaenugvfghjogqzbvfszuxfttnfwlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548134.324708-598-134151676673775/AnsiballZ_stat.py'
Jan 27 21:08:54 compute-1 sudo[166381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:54 compute-1 python3.9[166383]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:08:54 compute-1 sudo[166381]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:55 compute-1 sudo[166535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqdnbactmzyivoiefkqzyzfbopzjbcru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548135.1270974-614-15771939482210/AnsiballZ_command.py'
Jan 27 21:08:55 compute-1 sudo[166535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:55 compute-1 python3.9[166537]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:08:55 compute-1 sudo[166535]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:56 compute-1 sudo[166688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loqhzqpodgxrvxgivhoeslkfpqnrpemq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548136.0105395-632-7719602468835/AnsiballZ_systemd_service.py'
Jan 27 21:08:56 compute-1 sudo[166688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:56 compute-1 python3.9[166690]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:08:56 compute-1 systemd[1]: Listening on multipathd control socket.
Jan 27 21:08:56 compute-1 sudo[166688]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:57 compute-1 sudo[166844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hllhpgtgjbetfuqqvzophvimldqagpsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548136.9473443-648-95183732808697/AnsiballZ_systemd_service.py'
Jan 27 21:08:57 compute-1 sudo[166844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:57 compute-1 python3.9[166846]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:08:57 compute-1 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 27 21:08:57 compute-1 udevadm[166851]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 27 21:08:57 compute-1 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 27 21:08:57 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 27 21:08:57 compute-1 multipathd[166854]: --------start up--------
Jan 27 21:08:57 compute-1 multipathd[166854]: read /etc/multipath.conf
Jan 27 21:08:57 compute-1 multipathd[166854]: path checkers start up
Jan 27 21:08:57 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 27 21:08:57 compute-1 sudo[166844]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:58 compute-1 sudo[167011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrdnwjcjlkgxerblifycfuczmmlubqdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548138.3288853-672-204772394604581/AnsiballZ_file.py'
Jan 27 21:08:58 compute-1 sudo[167011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:58 compute-1 python3.9[167013]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 27 21:08:58 compute-1 sudo[167011]: pam_unix(sudo:session): session closed for user root
Jan 27 21:08:59 compute-1 sudo[167163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iamsbxwjkyarvzmevsfgunmypdrbymlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548139.08762-688-160177422664908/AnsiballZ_modprobe.py'
Jan 27 21:08:59 compute-1 sudo[167163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:08:59 compute-1 python3.9[167165]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 27 21:08:59 compute-1 kernel: Key type psk registered
Jan 27 21:08:59 compute-1 sudo[167163]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:00 compute-1 sudo[167327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmansiqblmnravaeusuywmxydtccvjas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548139.887472-704-245332837008741/AnsiballZ_stat.py'
Jan 27 21:09:00 compute-1 sudo[167327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:00 compute-1 python3.9[167329]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:09:00 compute-1 sudo[167327]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:00 compute-1 sudo[167450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abkzbocqncaoeqgndyxtepvmwlquvlmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548139.887472-704-245332837008741/AnsiballZ_copy.py'
Jan 27 21:09:00 compute-1 sudo[167450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:01 compute-1 python3.9[167452]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548139.887472-704-245332837008741/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:01 compute-1 sudo[167450]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:01 compute-1 sudo[167602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqjaunrkhmuvsashvuivxvyexhifsscd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548141.3671398-737-197195751549340/AnsiballZ_lineinfile.py'
Jan 27 21:09:01 compute-1 sudo[167602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:01 compute-1 python3.9[167604]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:01 compute-1 sudo[167602]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:02 compute-1 sudo[167756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekjoxlbhdmszvzepqdrcxzhasqrsktsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548142.12298-752-195863533770464/AnsiballZ_systemd.py'
Jan 27 21:09:02 compute-1 sudo[167756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:02 compute-1 sshd-session[167605]: Invalid user solana from 80.94.92.186 port 37128
Jan 27 21:09:02 compute-1 python3.9[167758]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 21:09:02 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 27 21:09:02 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 27 21:09:02 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 27 21:09:02 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 27 21:09:02 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 27 21:09:02 compute-1 sudo[167756]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:02 compute-1 sshd-session[167605]: Connection closed by invalid user solana 80.94.92.186 port 37128 [preauth]
Jan 27 21:09:03 compute-1 sudo[167912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppmvhjvcbrmtfvuzjlllmawckjvwslaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548143.0361936-768-43068588008898/AnsiballZ_dnf.py'
Jan 27 21:09:03 compute-1 sudo[167912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:03 compute-1 python3.9[167914]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 27 21:09:05 compute-1 systemd[1]: Reloading.
Jan 27 21:09:06 compute-1 systemd-rc-local-generator[167947]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:09:06 compute-1 systemd-sysv-generator[167951]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:09:06 compute-1 systemd[1]: Reloading.
Jan 27 21:09:06 compute-1 systemd-rc-local-generator[167983]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:09:06 compute-1 systemd-sysv-generator[167986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:09:06 compute-1 systemd-logind[786]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 27 21:09:06 compute-1 systemd-logind[786]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 27 21:09:06 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 27 21:09:06 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 27 21:09:06 compute-1 systemd[1]: Reloading.
Jan 27 21:09:07 compute-1 systemd-rc-local-generator[168080]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:09:07 compute-1 systemd-sysv-generator[168083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:09:07 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 27 21:09:07 compute-1 sudo[167912]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:08 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 27 21:09:08 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 27 21:09:08 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.671s CPU time.
Jan 27 21:09:08 compute-1 systemd[1]: run-rff661068398e4973946f1bdfb31ab953.service: Deactivated successfully.
Jan 27 21:09:08 compute-1 sudo[169378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxzjnhhdnnonejubrkqymowernjmohek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548148.2496142-784-148744305911759/AnsiballZ_systemd_service.py'
Jan 27 21:09:08 compute-1 sudo[169378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:08 compute-1 python3.9[169380]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 21:09:08 compute-1 systemd[1]: Stopping Open-iSCSI...
Jan 27 21:09:09 compute-1 iscsid[162885]: iscsid shutting down.
Jan 27 21:09:09 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Jan 27 21:09:09 compute-1 systemd[1]: Stopped Open-iSCSI.
Jan 27 21:09:09 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 27 21:09:09 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 27 21:09:09 compute-1 systemd[1]: Started Open-iSCSI.
Jan 27 21:09:09 compute-1 sudo[169378]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:09 compute-1 sudo[169534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocxiupwfcopbvumsupibfxlszdeafxbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548149.4041693-800-8238876890212/AnsiballZ_systemd_service.py'
Jan 27 21:09:09 compute-1 sudo[169534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:10 compute-1 python3.9[169536]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 21:09:10 compute-1 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 27 21:09:10 compute-1 multipathd[166854]: exit (signal)
Jan 27 21:09:10 compute-1 multipathd[166854]: --------shut down-------
Jan 27 21:09:10 compute-1 systemd[1]: multipathd.service: Deactivated successfully.
Jan 27 21:09:10 compute-1 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 27 21:09:10 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 27 21:09:10 compute-1 multipathd[169542]: --------start up--------
Jan 27 21:09:10 compute-1 multipathd[169542]: read /etc/multipath.conf
Jan 27 21:09:10 compute-1 multipathd[169542]: path checkers start up
Jan 27 21:09:10 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 27 21:09:10 compute-1 sudo[169534]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:09:11.145 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:09:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:09:11.146 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:09:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:09:11.147 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:09:11 compute-1 python3.9[169700]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:09:12 compute-1 sudo[169865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dizpncluujhxdmzvgmcjfjuyckgfxtpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548152.1100695-835-243168754531326/AnsiballZ_file.py'
Jan 27 21:09:12 compute-1 sudo[169865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:12 compute-1 podman[169828]: 2026-01-27 21:09:12.521252526 +0000 UTC m=+0.116087198 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20260126)
Jan 27 21:09:12 compute-1 python3.9[169872]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:12 compute-1 sudo[169865]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:13 compute-1 sudo[170032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfxgwusqajofppxiclzekcqtfmhfscml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548153.117414-857-246095423841533/AnsiballZ_systemd_service.py'
Jan 27 21:09:13 compute-1 sudo[170032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:13 compute-1 python3.9[170034]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:09:13 compute-1 systemd[1]: Reloading.
Jan 27 21:09:13 compute-1 systemd-sysv-generator[170066]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:09:13 compute-1 systemd-rc-local-generator[170061]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:09:14 compute-1 sudo[170032]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:14 compute-1 python3.9[170219]: ansible-ansible.builtin.service_facts Invoked
Jan 27 21:09:14 compute-1 network[170236]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 27 21:09:14 compute-1 network[170237]: 'network-scripts' will be removed from distribution in near future.
Jan 27 21:09:14 compute-1 network[170238]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 27 21:09:16 compute-1 podman[170265]: 2026-01-27 21:09:16.157058286 +0000 UTC m=+0.072049992 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:09:19 compute-1 sudo[170528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jewliwlwlvlecbhsqhpakvyqnofvtddf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548159.283445-895-178755921629143/AnsiballZ_systemd_service.py'
Jan 27 21:09:19 compute-1 sudo[170528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:19 compute-1 python3.9[170530]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:09:19 compute-1 sudo[170528]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:21 compute-1 sudo[170681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpdlwmnzrsunldyzggleukmoajqfxpyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548160.0921972-895-76507371369524/AnsiballZ_systemd_service.py'
Jan 27 21:09:21 compute-1 sudo[170681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:21 compute-1 python3.9[170683]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:09:21 compute-1 sudo[170681]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:21 compute-1 sudo[170834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gybtugfxiovelqtmafigblqonrelhbas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548161.563695-895-214334919087663/AnsiballZ_systemd_service.py'
Jan 27 21:09:21 compute-1 sudo[170834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:22 compute-1 python3.9[170836]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:09:22 compute-1 sudo[170834]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:22 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 27 21:09:22 compute-1 sudo[170988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpgvhttfcbiqesrepcghuecbglivismc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548162.3510811-895-121835224897587/AnsiballZ_systemd_service.py'
Jan 27 21:09:22 compute-1 sudo[170988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:22 compute-1 python3.9[170990]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:09:22 compute-1 sudo[170988]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:23 compute-1 sudo[171141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoingcxhlavqlpawdvlvtawmwfxjkzfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548163.0694954-895-229681640568325/AnsiballZ_systemd_service.py'
Jan 27 21:09:23 compute-1 sudo[171141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:23 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 27 21:09:23 compute-1 python3.9[171143]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:09:23 compute-1 sudo[171141]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:24 compute-1 sudo[171295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcavwmsodjtuhcnycxlncjqrnfwqxvos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548163.92685-895-100158520060825/AnsiballZ_systemd_service.py'
Jan 27 21:09:24 compute-1 sudo[171295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:24 compute-1 python3.9[171297]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:09:24 compute-1 sudo[171295]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:24 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 27 21:09:24 compute-1 sudo[171449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgfdlqfnkjvjxtkcpqtlmeknpupoloxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548164.642935-895-191367436729430/AnsiballZ_systemd_service.py'
Jan 27 21:09:24 compute-1 sudo[171449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:25 compute-1 python3.9[171451]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:09:25 compute-1 sudo[171449]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:25 compute-1 sudo[171602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohynuroeboxcudlylkckeyaikupczzpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548165.4451072-895-123838890334090/AnsiballZ_systemd_service.py'
Jan 27 21:09:25 compute-1 sudo[171602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:25 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 27 21:09:26 compute-1 python3.9[171605]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:09:26 compute-1 sudo[171602]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:26 compute-1 sudo[171756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkzbikasepddaqvxjldtwltmatmdnbfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548166.5721817-1013-172619366354924/AnsiballZ_file.py'
Jan 27 21:09:26 compute-1 sudo[171756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:27 compute-1 python3.9[171758]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:27 compute-1 sudo[171756]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:27 compute-1 sudo[171908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftirkrblkyponjwftjnzruxelghydyyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548167.2392406-1013-166299216811478/AnsiballZ_file.py'
Jan 27 21:09:27 compute-1 sudo[171908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:27 compute-1 python3.9[171910]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:27 compute-1 sudo[171908]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:28 compute-1 sudo[172060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhmnwfbbebvchtchohbnrigzsnimtrqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548167.8622074-1013-273067740985116/AnsiballZ_file.py'
Jan 27 21:09:28 compute-1 sudo[172060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:28 compute-1 python3.9[172062]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:28 compute-1 sudo[172060]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:29 compute-1 sudo[172212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgjvdszgztwrlbctfnhmbrewmgorubkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548168.9507139-1013-249610573527296/AnsiballZ_file.py'
Jan 27 21:09:29 compute-1 sudo[172212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:29 compute-1 python3.9[172214]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:29 compute-1 sudo[172212]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:30 compute-1 sudo[172364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njkxdwtwjwlmivsymsioepubvgyhdxov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548170.0846946-1013-138505715778716/AnsiballZ_file.py'
Jan 27 21:09:30 compute-1 sudo[172364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:30 compute-1 python3.9[172366]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:30 compute-1 sudo[172364]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:31 compute-1 sudo[172516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ampawskqugznmeliebstrthmacjrchze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548170.711058-1013-221452942370119/AnsiballZ_file.py'
Jan 27 21:09:31 compute-1 sudo[172516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:31 compute-1 python3.9[172518]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:31 compute-1 sudo[172516]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:31 compute-1 sudo[172668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcykhggubvuvkdiwrzqlvbixfnqtvosr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548171.388117-1013-218266855547617/AnsiballZ_file.py'
Jan 27 21:09:31 compute-1 sudo[172668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:31 compute-1 python3.9[172670]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:31 compute-1 sudo[172668]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:32 compute-1 sudo[172820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxfxsktsdrkfuyckhsqpflwbyshqgrsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548172.0272725-1013-196777155044533/AnsiballZ_file.py'
Jan 27 21:09:32 compute-1 sudo[172820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:32 compute-1 python3.9[172822]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:32 compute-1 sudo[172820]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:33 compute-1 sudo[172972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpzwnvqdwfqxbdknoomesmqdxmjmbpin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548172.781647-1127-131859047996844/AnsiballZ_file.py'
Jan 27 21:09:33 compute-1 sudo[172972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:33 compute-1 python3.9[172974]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:33 compute-1 sudo[172972]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:33 compute-1 sudo[173124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmhqbbsnkjrhpaiwmoiofbzcxapejgpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548173.4682562-1127-28876277120706/AnsiballZ_file.py'
Jan 27 21:09:33 compute-1 sudo[173124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:33 compute-1 python3.9[173126]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:33 compute-1 sudo[173124]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:34 compute-1 sudo[173276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omdfoodbbekdsgwuklqycrljdhpzoetp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548174.1425858-1127-251584934163771/AnsiballZ_file.py'
Jan 27 21:09:34 compute-1 sudo[173276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:34 compute-1 python3.9[173278]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:34 compute-1 sudo[173276]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:35 compute-1 sudo[173428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwddttmbwultgchtzfjdtqhgqmrelhyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548174.7734559-1127-104394817938822/AnsiballZ_file.py'
Jan 27 21:09:35 compute-1 sudo[173428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:35 compute-1 python3.9[173430]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:35 compute-1 sudo[173428]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:35 compute-1 sudo[173580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mipttdggexhyiaftahczrbuovqxnftes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548175.4836638-1127-253140926933133/AnsiballZ_file.py'
Jan 27 21:09:35 compute-1 sudo[173580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:36 compute-1 python3.9[173582]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:36 compute-1 sudo[173580]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:36 compute-1 sudo[173732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeivyygtrbwounstewwenbhrszdngarj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548176.2295-1127-41614738138878/AnsiballZ_file.py'
Jan 27 21:09:36 compute-1 sudo[173732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:36 compute-1 python3.9[173734]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:36 compute-1 sudo[173732]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:37 compute-1 sudo[173884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggmycmnouglkaqwkyzqakovttfpvbsbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548176.8649325-1127-261279371436159/AnsiballZ_file.py'
Jan 27 21:09:37 compute-1 sudo[173884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:37 compute-1 python3.9[173886]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:37 compute-1 sudo[173884]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:37 compute-1 sudo[174036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byijkpmlxmpgwaifxtuatozwhfdqdorj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548177.546939-1127-6144581755807/AnsiballZ_file.py'
Jan 27 21:09:37 compute-1 sudo[174036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:38 compute-1 python3.9[174038]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:09:38 compute-1 sudo[174036]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:39 compute-1 sudo[174188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raawhmpthfrwvafihveyzvptjcdtrrsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548179.0634105-1243-158875851533249/AnsiballZ_command.py'
Jan 27 21:09:39 compute-1 sudo[174188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:39 compute-1 python3.9[174190]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:09:39 compute-1 sudo[174188]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:40 compute-1 python3.9[174342]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 27 21:09:41 compute-1 sudo[174492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqbzlcbkcvhpvnmcdulwwqbgkcyftizb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548181.225703-1279-264037508468480/AnsiballZ_systemd_service.py'
Jan 27 21:09:41 compute-1 sudo[174492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:41 compute-1 python3.9[174494]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:09:41 compute-1 systemd[1]: Reloading.
Jan 27 21:09:41 compute-1 systemd-rc-local-generator[174521]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:09:41 compute-1 systemd-sysv-generator[174524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:09:42 compute-1 sudo[174492]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:42 compute-1 sudo[174697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvbvfofkfcgwxuoylhwlyksrwyafyppb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548182.3706238-1295-205710121582089/AnsiballZ_command.py'
Jan 27 21:09:42 compute-1 sudo[174697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:42 compute-1 podman[174653]: 2026-01-27 21:09:42.801072412 +0000 UTC m=+0.175592019 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true)
Jan 27 21:09:42 compute-1 python3.9[174705]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:09:42 compute-1 sudo[174697]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:43 compute-1 sudo[174860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enegtyzhkjajipftcknlvuwsshidsumj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548183.05758-1295-265874723888225/AnsiballZ_command.py'
Jan 27 21:09:43 compute-1 sudo[174860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:43 compute-1 python3.9[174862]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:09:43 compute-1 sudo[174860]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:44 compute-1 sudo[175013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yffxghodaxocijeciczvgvgcpsjwsrtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548183.7128782-1295-156699489373609/AnsiballZ_command.py'
Jan 27 21:09:44 compute-1 sudo[175013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:44 compute-1 python3.9[175015]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:09:44 compute-1 sudo[175013]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:44 compute-1 sudo[175166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vululyacktvrbdjmsjhblngyosspucpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548184.4058826-1295-85024545946031/AnsiballZ_command.py'
Jan 27 21:09:44 compute-1 sudo[175166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:44 compute-1 python3.9[175168]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:09:44 compute-1 sudo[175166]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:45 compute-1 sudo[175319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtacviohekvaabuwktbymimuqibzlzuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548185.1129327-1295-216567822353028/AnsiballZ_command.py'
Jan 27 21:09:45 compute-1 sudo[175319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:45 compute-1 python3.9[175321]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:09:45 compute-1 sudo[175319]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:46 compute-1 sudo[175472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nughcvtzadebpdfjftyixpfvlcsqnidt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548185.7618985-1295-43401932375784/AnsiballZ_command.py'
Jan 27 21:09:46 compute-1 sudo[175472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:46 compute-1 python3.9[175474]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:09:46 compute-1 sudo[175472]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:46 compute-1 podman[175476]: 2026-01-27 21:09:46.375135821 +0000 UTC m=+0.055850907 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126)
Jan 27 21:09:46 compute-1 sudo[175644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aombwoczcmiwihsqbuhtorqmbgaxewqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548186.4600444-1295-24688003895021/AnsiballZ_command.py'
Jan 27 21:09:46 compute-1 sudo[175644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:46 compute-1 python3.9[175646]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:09:47 compute-1 sudo[175644]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:47 compute-1 sudo[175797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yycpimicibscuiadklmixwffloudyrvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548187.2142036-1295-212315882436408/AnsiballZ_command.py'
Jan 27 21:09:47 compute-1 sudo[175797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:47 compute-1 python3.9[175799]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:09:47 compute-1 sudo[175797]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:49 compute-1 sudo[175950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prptzzspwodghkdgvqtdqgsmvvkxnnpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548188.8518686-1438-78350722862728/AnsiballZ_file.py'
Jan 27 21:09:49 compute-1 sudo[175950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:49 compute-1 python3.9[175952]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:09:49 compute-1 sudo[175950]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:49 compute-1 sudo[176102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbgtoadoahhaxmlbnojljjwujadmibwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548189.6214645-1438-173429829738567/AnsiballZ_file.py'
Jan 27 21:09:49 compute-1 sudo[176102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:50 compute-1 python3.9[176104]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:09:50 compute-1 sudo[176102]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:51 compute-1 sudo[176254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvsmjnhjyduchxehsthddnyckhrrtiyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548191.0691397-1438-255004268304809/AnsiballZ_file.py'
Jan 27 21:09:51 compute-1 sudo[176254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:51 compute-1 python3.9[176256]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:09:51 compute-1 sudo[176254]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:52 compute-1 sudo[176406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jecptqvzhumtdegjzopnomytzwersgdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548191.7690005-1482-270638464952402/AnsiballZ_file.py'
Jan 27 21:09:52 compute-1 sudo[176406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:52 compute-1 python3.9[176408]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:09:52 compute-1 sudo[176406]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:52 compute-1 sudo[176558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oegikllsvfkxxfwgdzxyncdfeypwipha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548192.5596547-1482-197726671805944/AnsiballZ_file.py'
Jan 27 21:09:52 compute-1 sudo[176558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:53 compute-1 python3.9[176560]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:09:53 compute-1 sudo[176558]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:53 compute-1 sudo[176710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owjhjaortkndpmwkjoaawstuvnzihioc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548193.273108-1482-149109820480063/AnsiballZ_file.py'
Jan 27 21:09:53 compute-1 sudo[176710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:53 compute-1 python3.9[176712]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:09:53 compute-1 sudo[176710]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:54 compute-1 sudo[176862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uflawqtezcydryefzlatzxeqitjmszbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548193.8710818-1482-235215061593561/AnsiballZ_file.py'
Jan 27 21:09:54 compute-1 sudo[176862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:54 compute-1 python3.9[176864]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:09:54 compute-1 sudo[176862]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:55 compute-1 sudo[177014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txtrnqxesxrhwaxxohopnsmaiitgsfbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548195.1837547-1482-199631193998555/AnsiballZ_file.py'
Jan 27 21:09:55 compute-1 sudo[177014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:55 compute-1 python3.9[177016]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:09:55 compute-1 sudo[177014]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:56 compute-1 sudo[177166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yttzeruxchmlevrtbpxihxbmwkttbpzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548195.946527-1482-112262475689966/AnsiballZ_file.py'
Jan 27 21:09:56 compute-1 sudo[177166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:56 compute-1 python3.9[177168]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:09:56 compute-1 sudo[177166]: pam_unix(sudo:session): session closed for user root
Jan 27 21:09:56 compute-1 sudo[177318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moqlbtckwhphhimqanlghsionlwfpval ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548196.6641972-1482-147257838631578/AnsiballZ_file.py'
Jan 27 21:09:56 compute-1 sudo[177318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:09:57 compute-1 python3.9[177320]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:09:57 compute-1 sudo[177318]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:02 compute-1 sudo[177470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzofyhvditcwpoafwquecgyelveojsyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548201.9839191-1719-95735951819381/AnsiballZ_getent.py'
Jan 27 21:10:02 compute-1 sudo[177470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:02 compute-1 python3.9[177472]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 27 21:10:02 compute-1 sudo[177470]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:03 compute-1 sudo[177623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhcdmbblplzdbqydskinsrlrrbcrzvqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548202.9987423-1735-23267682008486/AnsiballZ_group.py'
Jan 27 21:10:03 compute-1 sudo[177623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:03 compute-1 python3.9[177625]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 27 21:10:03 compute-1 groupadd[177626]: group added to /etc/group: name=nova, GID=42436
Jan 27 21:10:03 compute-1 groupadd[177626]: group added to /etc/gshadow: name=nova
Jan 27 21:10:03 compute-1 groupadd[177626]: new group: name=nova, GID=42436
Jan 27 21:10:03 compute-1 sudo[177623]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:04 compute-1 sudo[177781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnpjeimxpeudufslsegtsxjuexysrhfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548204.1237683-1751-115704968081778/AnsiballZ_user.py'
Jan 27 21:10:04 compute-1 sudo[177781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:04 compute-1 python3.9[177783]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 27 21:10:04 compute-1 useradd[177785]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 27 21:10:04 compute-1 useradd[177785]: add 'nova' to group 'libvirt'
Jan 27 21:10:04 compute-1 useradd[177785]: add 'nova' to shadow group 'libvirt'
Jan 27 21:10:05 compute-1 sudo[177781]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:06 compute-1 sshd-session[177816]: Accepted publickey for zuul from 192.168.122.30 port 39168 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 21:10:06 compute-1 systemd-logind[786]: New session 26 of user zuul.
Jan 27 21:10:06 compute-1 systemd[1]: Started Session 26 of User zuul.
Jan 27 21:10:06 compute-1 sshd-session[177816]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 21:10:06 compute-1 sshd-session[177819]: Received disconnect from 192.168.122.30 port 39168:11: disconnected by user
Jan 27 21:10:06 compute-1 sshd-session[177819]: Disconnected from user zuul 192.168.122.30 port 39168
Jan 27 21:10:06 compute-1 sshd-session[177816]: pam_unix(sshd:session): session closed for user zuul
Jan 27 21:10:06 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Jan 27 21:10:06 compute-1 systemd-logind[786]: Session 26 logged out. Waiting for processes to exit.
Jan 27 21:10:06 compute-1 systemd-logind[786]: Removed session 26.
Jan 27 21:10:06 compute-1 python3.9[177969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:10:07 compute-1 python3.9[178090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548206.37733-1801-6158400997777/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:10:08 compute-1 python3.9[178240]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:10:08 compute-1 python3.9[178316]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:10:09 compute-1 python3.9[178466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:10:09 compute-1 python3.9[178587]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548208.7842264-1801-166038156597992/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:10:10 compute-1 python3.9[178737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:10:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:10:11.148 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:10:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:10:11.149 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:10:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:10:11.149 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:10:11 compute-1 python3.9[178859]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548210.0116603-1801-126227954450051/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:10:12 compute-1 python3.9[179009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:10:12 compute-1 python3.9[179130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548211.7616386-1801-182991417872833/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:10:13 compute-1 podman[179131]: 2026-01-27 21:10:13.01875109 +0000 UTC m=+0.100510459 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Jan 27 21:10:13 compute-1 python3.9[179306]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:10:14 compute-1 python3.9[179427]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548213.0496018-1801-120751133121444/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:10:14 compute-1 sudo[179577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaphpownmzxeapjxuenpqbqpathxxnmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548214.4003227-1967-7832877860085/AnsiballZ_file.py'
Jan 27 21:10:14 compute-1 sudo[179577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:14 compute-1 python3.9[179579]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:10:14 compute-1 sudo[179577]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:15 compute-1 sudo[179729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yafyhoobifzvswfbffwbtrcbquwhllmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548215.124499-1983-85331372243479/AnsiballZ_copy.py'
Jan 27 21:10:15 compute-1 sudo[179729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:15 compute-1 python3.9[179731]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:10:15 compute-1 sudo[179729]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:16 compute-1 sudo[179881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzpuqwlrtlqtxsnyjkmtdmbwgayouzyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548215.7948875-1999-88450433363000/AnsiballZ_stat.py'
Jan 27 21:10:16 compute-1 sudo[179881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:16 compute-1 python3.9[179883]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:10:16 compute-1 sudo[179881]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:16 compute-1 podman[179983]: 2026-01-27 21:10:16.753810768 +0000 UTC m=+0.056677992 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 27 21:10:16 compute-1 sudo[180051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwkxqkmmzicarxqwqdcqpsjtbeyplaqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548216.532156-2015-85947243376728/AnsiballZ_stat.py'
Jan 27 21:10:16 compute-1 sudo[180051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:17 compute-1 python3.9[180053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:10:17 compute-1 sudo[180051]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:17 compute-1 sudo[180174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixzkzyiinjndkskmvekpmhbuueemhvdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548216.532156-2015-85947243376728/AnsiballZ_copy.py'
Jan 27 21:10:17 compute-1 sudo[180174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:17 compute-1 python3.9[180176]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769548216.532156-2015-85947243376728/.source _original_basename=.15y7mzm7 follow=False checksum=02d7dda6705e7c517c57f2e4e8cdbd219564681e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 27 21:10:17 compute-1 sudo[180174]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:18 compute-1 python3.9[180328]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:10:19 compute-1 python3.9[180480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:10:19 compute-1 python3.9[180601]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548218.8659275-2067-264664689779821/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=59122c2bee6bdb86202aee6ed9b49e5054f6e4a7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:10:20 compute-1 python3.9[180751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:10:21 compute-1 python3.9[180872]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548220.1484652-2097-19828409856352/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=dd30a798456f209678ffb83ce1e64801d570ba0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:10:22 compute-1 sudo[181022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbhawuonxbishxdvauncggxumlgmhjjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548221.6963804-2131-273628958418819/AnsiballZ_container_config_data.py'
Jan 27 21:10:22 compute-1 sudo[181022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:22 compute-1 python3.9[181024]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 27 21:10:22 compute-1 sudo[181022]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:23 compute-1 sudo[181174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dihkanfywsfjzybxyvhvxuzjhstlbrmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548223.4993227-2153-62065246050880/AnsiballZ_container_config_hash.py'
Jan 27 21:10:23 compute-1 sudo[181174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:24 compute-1 python3.9[181176]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 27 21:10:24 compute-1 sudo[181174]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:25 compute-1 sudo[181326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxjtoxhuudbjbujmgkqdiyxvrudyyoil ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769548225.1761062-2173-49903619176732/AnsiballZ_edpm_container_manage.py'
Jan 27 21:10:25 compute-1 sudo[181326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:25 compute-1 python3[181328]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 27 21:10:26 compute-1 podman[181363]: 2026-01-27 21:10:26.226764635 +0000 UTC m=+0.082282747 container create 9aac9a1db80effee9334b61e2430f8f9192b01860ea0f378e0fa085acac1b158 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=nova_compute_init, io.buildah.version=1.41.4)
Jan 27 21:10:26 compute-1 podman[181363]: 2026-01-27 21:10:26.188045368 +0000 UTC m=+0.043563560 image pull a5aa004c3a6db392cb04fafa2aacae4b2b1bb5836e3e54d23b692771193184c9 38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Jan 27 21:10:26 compute-1 python3[181328]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 27 21:10:26 compute-1 sudo[181326]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:26 compute-1 sudo[181549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neeqzmmzreexlgtulxdgebflmgoautqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548226.5933049-2190-156643426694406/AnsiballZ_stat.py'
Jan 27 21:10:26 compute-1 sudo[181549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:27 compute-1 python3.9[181551]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:10:27 compute-1 sudo[181549]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:28 compute-1 sudo[181703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvjvzgfxjvmmxelwhutwoxvsnwnuehfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548227.9849622-2213-108607636140760/AnsiballZ_container_config_data.py'
Jan 27 21:10:28 compute-1 sudo[181703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:28 compute-1 python3.9[181705]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 27 21:10:28 compute-1 sudo[181703]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:29 compute-1 sudo[181855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzhmxiqtpuvsbgkkppagavoeblazmgpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548228.841629-2235-71055436852500/AnsiballZ_container_config_hash.py'
Jan 27 21:10:29 compute-1 sudo[181855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:29 compute-1 python3.9[181857]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 27 21:10:29 compute-1 sudo[181855]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:30 compute-1 sudo[182007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iolbgtrtvhbfjstnjxkarpcluezxavyu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769548229.7425296-2255-171886300222836/AnsiballZ_edpm_container_manage.py'
Jan 27 21:10:30 compute-1 sudo[182007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:30 compute-1 python3[182009]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 27 21:10:30 compute-1 podman[182045]: 2026-01-27 21:10:30.535869181 +0000 UTC m=+0.058255580 container create 9ee62307ec8fc65ca0c71dfd43ca3cea2200726d548f6d09e202028328bcaec0 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_id=edpm, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, container_name=nova_compute, io.buildah.version=1.41.4, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 27 21:10:30 compute-1 podman[182045]: 2026-01-27 21:10:30.503690829 +0000 UTC m=+0.026077278 image pull a5aa004c3a6db392cb04fafa2aacae4b2b1bb5836e3e54d23b692771193184c9 38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Jan 27 21:10:30 compute-1 python3[182009]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Jan 27 21:10:30 compute-1 sudo[182007]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:31 compute-1 sudo[182233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfvjpbjhemoumlzowaifqgflbiwdrgid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548230.9343977-2271-77743282176160/AnsiballZ_stat.py'
Jan 27 21:10:31 compute-1 sudo[182233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:31 compute-1 python3.9[182235]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:10:32 compute-1 sudo[182233]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:32 compute-1 sudo[182387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znzrwglpjsmpbtugfgnmshgrqzdfytzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548232.2994947-2289-155606856454672/AnsiballZ_file.py'
Jan 27 21:10:32 compute-1 sudo[182387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:32 compute-1 python3.9[182389]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:10:32 compute-1 sudo[182387]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:34 compute-1 sudo[182538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-narsfwdkuyhywccbauwrtrogxpngnmyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548232.8597207-2289-4036976635288/AnsiballZ_copy.py'
Jan 27 21:10:34 compute-1 sudo[182538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:34 compute-1 python3.9[182540]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769548232.8597207-2289-4036976635288/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:10:34 compute-1 sudo[182538]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:34 compute-1 sudo[182614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wznpywlosnbneexsxadtksywoazoabyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548232.8597207-2289-4036976635288/AnsiballZ_systemd.py'
Jan 27 21:10:34 compute-1 sudo[182614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:34 compute-1 python3.9[182616]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:10:34 compute-1 systemd[1]: Reloading.
Jan 27 21:10:34 compute-1 systemd-rc-local-generator[182635]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:10:34 compute-1 systemd-sysv-generator[182641]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:10:35 compute-1 sudo[182614]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:36 compute-1 sudo[182725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asovxmztdyrvrjqftfbiswesjwnkvpop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548232.8597207-2289-4036976635288/AnsiballZ_systemd.py'
Jan 27 21:10:36 compute-1 sudo[182725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:36 compute-1 python3.9[182727]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:10:36 compute-1 systemd[1]: Reloading.
Jan 27 21:10:36 compute-1 systemd-rc-local-generator[182755]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:10:36 compute-1 systemd-sysv-generator[182760]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:10:36 compute-1 systemd[1]: Starting nova_compute container...
Jan 27 21:10:36 compute-1 systemd[1]: Started libcrun container.
Jan 27 21:10:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:36 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:36 compute-1 podman[182766]: 2026-01-27 21:10:36.967128991 +0000 UTC m=+0.148568690 container init 9ee62307ec8fc65ca0c71dfd43ca3cea2200726d548f6d09e202028328bcaec0 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, container_name=nova_compute)
Jan 27 21:10:36 compute-1 podman[182766]: 2026-01-27 21:10:36.978153209 +0000 UTC m=+0.159592888 container start 9ee62307ec8fc65ca0c71dfd43ca3cea2200726d548f6d09e202028328bcaec0 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:10:36 compute-1 podman[182766]: nova_compute
Jan 27 21:10:36 compute-1 nova_compute[182781]: + sudo -E kolla_set_configs
Jan 27 21:10:36 compute-1 systemd[1]: Started nova_compute container.
Jan 27 21:10:37 compute-1 sudo[182725]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Validating config file
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Copying service configuration files
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Deleting /etc/ceph
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Creating directory /etc/ceph
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /etc/ceph
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Writing out command to execute
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 27 21:10:37 compute-1 nova_compute[182781]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 27 21:10:37 compute-1 nova_compute[182781]: ++ cat /run_command
Jan 27 21:10:37 compute-1 nova_compute[182781]: + CMD=nova-compute
Jan 27 21:10:37 compute-1 nova_compute[182781]: + ARGS=
Jan 27 21:10:37 compute-1 nova_compute[182781]: + sudo kolla_copy_cacerts
Jan 27 21:10:37 compute-1 nova_compute[182781]: + [[ ! -n '' ]]
Jan 27 21:10:37 compute-1 nova_compute[182781]: + . kolla_extend_start
Jan 27 21:10:37 compute-1 nova_compute[182781]: + echo 'Running command: '\''nova-compute'\'''
Jan 27 21:10:37 compute-1 nova_compute[182781]: Running command: 'nova-compute'
Jan 27 21:10:37 compute-1 nova_compute[182781]: + umask 0022
Jan 27 21:10:37 compute-1 nova_compute[182781]: + exec nova-compute
Jan 27 21:10:38 compute-1 python3.9[182942]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:10:39 compute-1 nova_compute[182781]: 2026-01-27 21:10:39.056 182785 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 27 21:10:39 compute-1 nova_compute[182781]: 2026-01-27 21:10:39.056 182785 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 27 21:10:39 compute-1 nova_compute[182781]: 2026-01-27 21:10:39.057 182785 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 27 21:10:39 compute-1 nova_compute[182781]: 2026-01-27 21:10:39.057 182785 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 27 21:10:39 compute-1 nova_compute[182781]: 2026-01-27 21:10:39.192 182785 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:10:39 compute-1 nova_compute[182781]: 2026-01-27 21:10:39.219 182785 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:10:39 compute-1 nova_compute[182781]: 2026-01-27 21:10:39.220 182785 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Jan 27 21:10:39 compute-1 nova_compute[182781]: 2026-01-27 21:10:39.252 182785 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Jan 27 21:10:39 compute-1 nova_compute[182781]: 2026-01-27 21:10:39.254 182785 WARNING oslo_config.cfg [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Jan 27 21:10:39 compute-1 python3.9[183095]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:10:40 compute-1 python3.9[183245]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.321 182785 INFO nova.virt.driver [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.404 182785 INFO nova.compute.provider_config [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.911 182785 DEBUG oslo_concurrency.lockutils [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.912 182785 DEBUG oslo_concurrency.lockutils [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.913 182785 DEBUG oslo_concurrency.lockutils [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.913 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.913 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.914 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.914 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.914 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.915 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.915 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.915 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.916 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.916 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.916 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.916 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.917 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.917 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.917 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.917 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.918 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.918 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.918 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.918 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.919 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.919 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.919 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.920 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.920 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.920 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.920 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.921 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.921 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.921 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.922 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.922 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.922 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.922 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.923 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.923 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.923 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.923 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.923 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.924 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.924 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.924 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.924 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.925 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.925 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.925 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.925 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.926 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.926 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.926 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.926 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.927 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.927 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.927 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.927 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.927 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.927 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.928 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.928 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.928 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.928 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.928 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.928 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.928 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.928 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.929 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.929 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.929 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.929 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.929 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.929 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.929 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.929 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.930 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.930 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.930 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.930 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.930 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.930 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.930 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.930 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.931 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.931 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.931 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.931 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.931 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.931 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.931 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.931 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.932 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.932 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.932 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.932 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.932 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.932 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.932 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.932 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.932 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.933 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.933 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.933 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.933 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.933 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.933 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.933 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.933 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.933 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.934 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.934 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.934 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.934 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.934 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.934 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.934 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.934 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.934 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.935 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.935 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.935 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.935 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.935 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.935 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.935 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.935 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.935 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.936 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.936 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.936 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.936 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.936 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.936 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.936 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.936 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.936 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.937 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.937 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.937 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.937 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.937 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.937 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.937 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.938 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.938 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.938 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.938 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.938 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.938 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.938 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.939 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.939 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.939 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.939 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.939 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.939 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.940 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.940 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.940 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.940 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.940 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.940 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.941 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.941 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.941 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.941 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.941 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.941 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.942 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.942 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.942 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.942 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.942 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.942 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.942 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.943 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.943 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.943 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.943 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.943 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.943 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.943 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.943 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.944 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.944 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.944 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.944 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.944 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.944 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.944 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.944 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.945 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.945 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.945 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.945 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.945 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.945 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.945 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.945 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.946 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.946 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.946 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.946 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.946 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.946 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.946 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.946 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.947 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.947 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.947 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.947 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.947 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.947 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.947 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.947 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.948 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.948 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.948 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.948 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.948 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.948 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.948 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.948 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.949 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.949 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.949 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.949 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.949 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.949 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.949 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.949 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.949 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.950 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.950 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.950 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.950 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.950 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.950 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.950 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.950 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.951 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.951 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.951 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.951 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.951 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.951 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.951 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.951 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.951 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.952 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.952 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.952 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.952 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.952 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.952 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.952 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.952 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.953 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.953 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.953 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.953 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.953 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.953 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.953 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.953 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.954 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.954 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.954 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.954 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.954 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.954 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.954 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.954 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.955 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.955 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.955 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.955 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.955 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.955 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.955 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.955 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.956 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.956 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.956 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.956 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.956 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.956 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.957 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.957 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.957 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.957 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.957 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.957 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.957 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.958 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.958 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.958 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.958 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.958 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.958 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.959 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.959 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.959 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.959 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.959 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.959 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.959 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.959 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.959 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.960 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.960 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.960 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.960 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.960 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.960 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.961 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.961 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.961 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.961 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.961 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.961 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.961 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.962 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.962 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.962 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.962 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.962 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.962 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.962 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.962 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.963 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.963 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.963 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.963 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.963 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.963 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.964 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.964 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.964 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.964 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.964 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.964 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.964 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.964 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.965 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.965 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.965 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.965 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.965 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.965 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.965 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.965 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.966 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.966 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.966 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.966 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.966 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.966 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.966 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.966 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.967 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.967 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.967 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.967 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.967 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.967 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.967 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.967 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.968 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.968 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.968 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.968 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.968 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.968 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.968 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.968 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.969 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.969 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.969 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.969 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.969 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.969 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.970 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.970 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.970 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.970 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.970 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.970 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.970 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.971 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.971 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.971 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.971 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.971 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.971 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.971 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.971 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.972 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.972 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.972 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.972 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.972 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.972 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.973 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.973 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.973 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.973 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.973 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.973 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.973 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.974 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.974 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.974 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.974 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.974 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.974 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.975 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.975 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.975 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.975 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.975 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.975 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.975 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.975 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.976 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.976 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.976 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.976 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.976 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.976 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.976 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.976 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.977 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.977 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.977 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.977 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.977 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.977 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.977 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.977 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.978 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.978 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.978 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.978 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.978 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.978 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.978 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.979 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.979 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.979 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.979 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.979 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.979 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.980 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.980 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.980 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.980 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.980 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.980 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.980 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.981 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.981 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.981 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.981 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.981 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.981 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.982 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.982 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.982 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.982 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.982 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.982 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.982 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.982 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.983 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.983 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.983 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.983 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.983 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.983 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.983 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.984 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.984 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.984 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.984 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.984 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.984 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.984 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.985 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.985 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.985 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.985 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.985 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.986 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.986 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.986 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.986 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.986 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.986 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.986 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.986 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.987 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.987 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.987 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.987 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.987 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.987 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.987 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.987 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.987 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.988 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.988 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.988 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.988 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.988 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.988 182785 WARNING oslo_config.cfg [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 27 21:10:40 compute-1 nova_compute[182781]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 27 21:10:40 compute-1 nova_compute[182781]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 27 21:10:40 compute-1 nova_compute[182781]: and ``live_migration_inbound_addr`` respectively.
Jan 27 21:10:40 compute-1 nova_compute[182781]: ).  Its value may be silently ignored in the future.
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.988 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.989 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.989 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.989 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.989 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.989 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.989 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.989 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.990 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.990 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.990 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.990 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.990 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.990 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.990 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.990 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.990 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.991 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.991 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.991 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.991 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.991 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.991 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.991 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.992 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.992 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.992 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.992 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.992 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.992 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.992 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.993 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.993 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.993 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.993 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.993 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.993 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.993 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.994 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.994 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.994 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.994 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.994 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.994 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.994 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.994 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.995 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.995 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.995 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.995 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.995 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.995 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.995 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.995 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.996 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.996 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.996 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.996 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.996 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.996 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.996 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.996 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.997 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.997 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.997 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.997 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.997 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.997 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.997 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.997 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.997 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.998 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.998 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.998 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.998 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.998 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.998 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.998 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.999 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.999 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.999 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.999 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.999 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:40 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.999 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:40.999 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.000 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.000 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.000 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.000 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.000 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.000 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.000 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.001 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.001 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.001 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.001 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.001 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.001 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.001 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.002 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.002 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.002 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.002 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.002 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.002 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.002 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.002 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.003 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.003 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.003 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.003 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.003 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.003 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.003 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.003 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.004 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.004 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.004 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.004 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.004 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.004 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.005 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.005 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.005 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.005 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.005 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.005 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.005 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.005 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.006 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.006 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.006 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.006 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.006 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.006 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.006 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.006 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.007 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.007 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.007 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.007 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.007 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.007 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.007 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.007 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.008 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.008 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.008 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.008 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.008 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.008 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.008 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.008 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.009 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.009 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.009 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.009 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.009 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.009 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.009 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.010 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.010 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.010 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.010 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.010 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.010 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.010 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.010 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.011 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.011 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.011 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.011 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.011 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.011 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.011 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.012 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.012 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.012 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.012 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.012 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.012 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.012 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.013 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.013 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.013 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.013 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.013 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.013 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.014 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.014 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.014 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.014 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.014 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.015 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.015 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.015 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.015 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.015 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.015 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.015 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.016 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.016 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.016 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.016 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.016 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.016 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.017 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.017 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.017 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.017 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.017 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.018 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.018 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.018 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.018 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.018 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.018 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.019 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.019 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.019 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.019 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.019 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.019 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.020 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.020 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.020 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.020 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.020 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.020 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.021 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.021 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.021 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.021 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.021 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.021 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.022 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.022 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.022 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.022 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.022 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.022 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.023 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.023 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.023 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.023 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.023 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.023 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.024 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.024 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.024 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.024 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.024 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.025 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.025 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.025 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.025 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.025 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.025 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.026 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.026 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.026 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.026 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.027 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.027 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.027 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.027 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.027 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.027 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.027 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.028 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.028 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.028 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.028 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.028 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.028 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.028 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.028 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.029 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.029 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.029 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.029 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.029 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.029 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.030 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.030 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.030 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.030 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.030 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.030 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.030 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.031 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.031 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.031 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.031 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.031 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.031 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.031 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.031 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.032 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.032 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.032 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.032 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.032 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.032 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.032 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.032 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.033 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.033 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.033 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.033 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.033 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.033 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.033 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.033 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.033 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.034 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.034 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.034 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.034 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.034 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.034 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.034 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.034 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.035 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.035 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.035 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.035 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.035 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.035 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.035 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.035 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.035 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.036 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.036 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.036 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.036 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.036 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.036 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.036 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.036 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.037 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.037 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.037 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.037 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.037 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.037 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.038 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.038 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.038 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.038 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.038 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.038 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.038 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.038 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.039 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.039 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.039 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.039 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.039 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.039 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.039 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.040 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.040 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.040 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.040 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.040 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.040 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.040 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.040 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.041 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.041 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.041 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.041 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.041 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.041 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.041 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.042 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.042 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.042 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.042 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.042 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.042 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.042 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.042 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.043 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.043 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.043 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.043 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.043 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.043 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.043 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.043 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.043 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.044 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.044 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.044 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.044 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.044 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.044 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.045 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.045 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.045 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.045 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.045 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.045 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.045 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.046 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.046 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.046 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.046 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.046 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.046 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.046 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.046 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.046 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.047 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.047 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.047 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.047 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.047 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.047 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.047 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.047 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.047 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.048 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.048 182785 DEBUG oslo_service.backend._eventlet.service [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.048 182785 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Jan 27 21:10:41 compute-1 sudo[183397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmgboeyxwprwdridttozwxyqkyzqgifw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548240.794493-2409-229122778827177/AnsiballZ_podman_container.py'
Jan 27 21:10:41 compute-1 sudo[183397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.555 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Jan 27 21:10:41 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 27 21:10:41 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.643 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5a8fa3f380> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Jan 27 21:10:41 compute-1 nova_compute[182781]: libvirt:  error : internal error: could not initialize domain event timer
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.644 182785 WARNING nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.644 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5a8fa3f380> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.646 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.647 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.647 182785 INFO nova.utils [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] The default thread pool MainProcess.default is initialized
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.647 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Jan 27 21:10:41 compute-1 nova_compute[182781]: 2026-01-27 21:10:41.648 182785 INFO nova.virt.libvirt.driver [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Connection event '1' reason 'None'
Jan 27 21:10:41 compute-1 python3.9[183399]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 27 21:10:41 compute-1 sudo[183397]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:41 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 27 21:10:41 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.156 182785 WARNING nova.virt.libvirt.driver [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.157 182785 DEBUG nova.virt.libvirt.volume.mount [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 27 21:10:42 compute-1 sudo[183632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pungljsdbujzgyaeptkjkzssgtygmwdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548242.2063394-2425-16685038035533/AnsiballZ_systemd.py'
Jan 27 21:10:42 compute-1 sudo[183632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.611 182785 INFO nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Libvirt host capabilities <capabilities>
Jan 27 21:10:42 compute-1 nova_compute[182781]: 
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <host>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <uuid>3b9a1f76-d315-49d8-90b4-a523eb6cf5fa</uuid>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <cpu>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <arch>x86_64</arch>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model>EPYC-Rome-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <vendor>AMD</vendor>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <microcode version='16777317'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <signature family='23' model='49' stepping='0'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='x2apic'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='tsc-deadline'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='osxsave'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='hypervisor'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='tsc_adjust'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='spec-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='stibp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='arch-capabilities'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='cmp_legacy'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='topoext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='virt-ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='lbrv'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='tsc-scale'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='vmcb-clean'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='pause-filter'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='pfthreshold'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='svme-addr-chk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='rdctl-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='skip-l1dfl-vmentry'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='mds-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature name='pschange-mc-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <pages unit='KiB' size='4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <pages unit='KiB' size='2048'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <pages unit='KiB' size='1048576'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </cpu>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <power_management>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <suspend_mem/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <suspend_disk/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <suspend_hybrid/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </power_management>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <iommu support='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <migration_features>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <live/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <uri_transports>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <uri_transport>tcp</uri_transport>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <uri_transport>rdma</uri_transport>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </uri_transports>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </migration_features>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <topology>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <cells num='1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <cell id='0'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:           <memory unit='KiB'>7864308</memory>
Jan 27 21:10:42 compute-1 nova_compute[182781]:           <pages unit='KiB' size='4'>1966077</pages>
Jan 27 21:10:42 compute-1 nova_compute[182781]:           <pages unit='KiB' size='2048'>0</pages>
Jan 27 21:10:42 compute-1 nova_compute[182781]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 27 21:10:42 compute-1 nova_compute[182781]:           <distances>
Jan 27 21:10:42 compute-1 nova_compute[182781]:             <sibling id='0' value='10'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:           </distances>
Jan 27 21:10:42 compute-1 nova_compute[182781]:           <cpus num='8'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:           </cpus>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         </cell>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </cells>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </topology>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <cache>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </cache>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <secmodel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model>selinux</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <doi>0</doi>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </secmodel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <secmodel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model>dac</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <doi>0</doi>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </secmodel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </host>
Jan 27 21:10:42 compute-1 nova_compute[182781]: 
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <guest>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <os_type>hvm</os_type>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <arch name='i686'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <wordsize>32</wordsize>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <domain type='qemu'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <domain type='kvm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </arch>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <features>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <pae/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <nonpae/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <acpi default='on' toggle='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <apic default='on' toggle='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <cpuselection/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <deviceboot/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <disksnapshot default='on' toggle='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <externalSnapshot/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </features>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </guest>
Jan 27 21:10:42 compute-1 nova_compute[182781]: 
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <guest>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <os_type>hvm</os_type>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <arch name='x86_64'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <wordsize>64</wordsize>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <domain type='qemu'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <domain type='kvm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </arch>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <features>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <acpi default='on' toggle='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <apic default='on' toggle='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <cpuselection/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <deviceboot/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <disksnapshot default='on' toggle='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <externalSnapshot/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </features>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </guest>
Jan 27 21:10:42 compute-1 nova_compute[182781]: 
Jan 27 21:10:42 compute-1 nova_compute[182781]: </capabilities>
Jan 27 21:10:42 compute-1 nova_compute[182781]: 
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.620 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.652 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 27 21:10:42 compute-1 nova_compute[182781]: <domainCapabilities>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <path>/usr/libexec/qemu-kvm</path>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <domain>kvm</domain>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <arch>i686</arch>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <vcpu max='4096'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <iothreads supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <os supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <enum name='firmware'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <loader supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>rom</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pflash</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='readonly'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>yes</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>no</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='secure'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>no</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </loader>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </os>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <cpu>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='host-passthrough' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='hostPassthroughMigratable'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>on</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>off</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='maximum' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='maximumMigratable'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>on</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>off</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='host-model' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <vendor>AMD</vendor>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='x2apic'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc-deadline'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='hypervisor'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc_adjust'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='spec-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='stibp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='cmp_legacy'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='overflow-recov'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='succor'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='amd-ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='virt-ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='lbrv'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc-scale'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='vmcb-clean'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='flushbyasid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='pause-filter'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='pfthreshold'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='svme-addr-chk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='disable' name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='custom' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='ClearwaterForest'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ddpd-u'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sha512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='ClearwaterForest-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ddpd-u'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sha512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Dhyana-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Turin'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbpb'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Turin-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbpb'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-128'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-256'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-128'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-256'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v6'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v7'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='KnightsMill'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512er'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512pf'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='KnightsMill-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512er'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512pf'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G4-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tbm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G5-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tbm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='athlon'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='athlon-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='core2duo'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='core2duo-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='coreduo'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='coreduo-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='n270'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='n270-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='phenom'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='phenom-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </cpu>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <memoryBacking supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <enum name='sourceType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>file</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>anonymous</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>memfd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </memoryBacking>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <devices>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <disk supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='diskDevice'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>disk</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>cdrom</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>floppy</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>lun</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='bus'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>fdc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>scsi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>sata</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-non-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </disk>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <graphics supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vnc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>egl-headless</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dbus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </graphics>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <video supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='modelType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vga</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>cirrus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>none</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>bochs</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>ramfb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </video>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <hostdev supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='mode'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>subsystem</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='startupPolicy'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>default</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>mandatory</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>requisite</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>optional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='subsysType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pci</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>scsi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='capsType'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='pciBackend'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </hostdev>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <rng supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-non-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>random</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>egd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>builtin</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </rng>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <filesystem supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='driverType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>path</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>handle</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtiofs</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </filesystem>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <tpm supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tpm-tis</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tpm-crb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>emulator</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>external</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendVersion'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>2.0</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </tpm>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <redirdev supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='bus'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </redirdev>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <channel supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pty</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>unix</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </channel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <crypto supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>qemu</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>builtin</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </crypto>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <interface supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>default</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>passt</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </interface>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <panic supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>isa</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>hyperv</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </panic>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <console supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>null</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pty</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dev</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>file</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pipe</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>stdio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>udp</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tcp</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>unix</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>qemu-vdagent</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dbus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </console>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </devices>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <features>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <gic supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <vmcoreinfo supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <genid supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <backingStoreInput supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <backup supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <async-teardown supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <s390-pv supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <ps2 supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <tdx supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <sev supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <sgx supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <hyperv supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='features'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>relaxed</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vapic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>spinlocks</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vpindex</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>runtime</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>synic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>stimer</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>reset</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vendor_id</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>frequencies</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>reenlightenment</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tlbflush</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>ipi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>avic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>emsr_bitmap</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>xmm_input</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <defaults>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <spinlocks>4095</spinlocks>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <stimer_direct>on</stimer_direct>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <tlbflush_direct>on</tlbflush_direct>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <tlbflush_extended>on</tlbflush_extended>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </defaults>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </hyperv>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <launchSecurity supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </features>
Jan 27 21:10:42 compute-1 nova_compute[182781]: </domainCapabilities>
Jan 27 21:10:42 compute-1 nova_compute[182781]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.661 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 27 21:10:42 compute-1 nova_compute[182781]: <domainCapabilities>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <path>/usr/libexec/qemu-kvm</path>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <domain>kvm</domain>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <arch>i686</arch>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <vcpu max='240'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <iothreads supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <os supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <enum name='firmware'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <loader supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>rom</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pflash</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='readonly'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>yes</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>no</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='secure'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>no</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </loader>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </os>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <cpu>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='host-passthrough' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='hostPassthroughMigratable'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>on</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>off</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='maximum' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='maximumMigratable'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>on</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>off</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='host-model' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <vendor>AMD</vendor>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='x2apic'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc-deadline'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='hypervisor'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc_adjust'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='spec-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='stibp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='cmp_legacy'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='overflow-recov'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='succor'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='amd-ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='virt-ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='lbrv'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc-scale'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='vmcb-clean'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='flushbyasid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='pause-filter'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='pfthreshold'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='svme-addr-chk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='disable' name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='custom' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='ClearwaterForest'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ddpd-u'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sha512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='ClearwaterForest-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ddpd-u'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sha512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Dhyana-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Turin'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbpb'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Turin-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbpb'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-128'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-256'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-128'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-256'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v6'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v7'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='KnightsMill'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512er'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512pf'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='KnightsMill-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512er'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512pf'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G4-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tbm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G5-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tbm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='athlon'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='athlon-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='core2duo'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='core2duo-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='coreduo'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='coreduo-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='n270'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='n270-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='phenom'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='phenom-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </cpu>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <memoryBacking supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <enum name='sourceType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>file</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>anonymous</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>memfd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </memoryBacking>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <devices>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <disk supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='diskDevice'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>disk</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>cdrom</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>floppy</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>lun</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='bus'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>ide</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>fdc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>scsi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>sata</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-non-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </disk>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <graphics supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vnc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>egl-headless</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dbus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </graphics>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <video supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='modelType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vga</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>cirrus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>none</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>bochs</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>ramfb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </video>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <hostdev supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='mode'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>subsystem</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='startupPolicy'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>default</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>mandatory</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>requisite</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>optional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='subsysType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pci</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>scsi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='capsType'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='pciBackend'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </hostdev>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <rng supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-non-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>random</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>egd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>builtin</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </rng>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <filesystem supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='driverType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>path</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>handle</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtiofs</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </filesystem>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <tpm supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tpm-tis</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tpm-crb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>emulator</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>external</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendVersion'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>2.0</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </tpm>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <redirdev supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='bus'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </redirdev>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <channel supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pty</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>unix</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </channel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <crypto supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>qemu</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>builtin</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </crypto>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <interface supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>default</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>passt</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </interface>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <panic supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>isa</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>hyperv</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </panic>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <console supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>null</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pty</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dev</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>file</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pipe</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>stdio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>udp</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tcp</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>unix</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>qemu-vdagent</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dbus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </console>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </devices>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <features>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <gic supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <vmcoreinfo supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <genid supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <backingStoreInput supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <backup supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <async-teardown supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <s390-pv supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <ps2 supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <tdx supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <sev supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <sgx supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <hyperv supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='features'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>relaxed</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vapic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>spinlocks</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vpindex</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>runtime</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>synic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>stimer</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>reset</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vendor_id</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>frequencies</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>reenlightenment</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tlbflush</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>ipi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>avic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>emsr_bitmap</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>xmm_input</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <defaults>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <spinlocks>4095</spinlocks>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <stimer_direct>on</stimer_direct>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <tlbflush_direct>on</tlbflush_direct>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <tlbflush_extended>on</tlbflush_extended>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </defaults>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </hyperv>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <launchSecurity supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </features>
Jan 27 21:10:42 compute-1 nova_compute[182781]: </domainCapabilities>
Jan 27 21:10:42 compute-1 nova_compute[182781]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.753 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.758 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 27 21:10:42 compute-1 nova_compute[182781]: <domainCapabilities>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <path>/usr/libexec/qemu-kvm</path>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <domain>kvm</domain>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <arch>x86_64</arch>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <vcpu max='4096'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <iothreads supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <os supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <enum name='firmware'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>efi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <loader supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>rom</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pflash</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='readonly'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>yes</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>no</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='secure'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>yes</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>no</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </loader>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </os>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <cpu>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='host-passthrough' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='hostPassthroughMigratable'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>on</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>off</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='maximum' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='maximumMigratable'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>on</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>off</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='host-model' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <vendor>AMD</vendor>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='x2apic'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc-deadline'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='hypervisor'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc_adjust'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='spec-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='stibp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='cmp_legacy'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='overflow-recov'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='succor'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='amd-ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='virt-ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='lbrv'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc-scale'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='vmcb-clean'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='flushbyasid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='pause-filter'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='pfthreshold'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='svme-addr-chk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='disable' name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='custom' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='ClearwaterForest'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ddpd-u'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sha512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='ClearwaterForest-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ddpd-u'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sha512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Dhyana-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Turin'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbpb'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Turin-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbpb'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-128'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-256'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-128'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-256'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v6'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v7'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='KnightsMill'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512er'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512pf'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='KnightsMill-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512er'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512pf'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G4-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tbm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G5-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tbm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='athlon'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='athlon-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='core2duo'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='core2duo-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='coreduo'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='coreduo-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='n270'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='n270-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='phenom'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='phenom-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </cpu>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <memoryBacking supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <enum name='sourceType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>file</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>anonymous</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>memfd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </memoryBacking>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <devices>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <disk supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='diskDevice'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>disk</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>cdrom</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>floppy</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>lun</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='bus'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>fdc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>scsi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>sata</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-non-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </disk>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <graphics supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vnc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>egl-headless</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dbus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </graphics>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <video supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='modelType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vga</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>cirrus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>none</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>bochs</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>ramfb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </video>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <hostdev supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='mode'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>subsystem</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='startupPolicy'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>default</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>mandatory</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>requisite</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>optional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='subsysType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pci</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>scsi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='capsType'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='pciBackend'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </hostdev>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <rng supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-non-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>random</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>egd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>builtin</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </rng>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <filesystem supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='driverType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>path</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>handle</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtiofs</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </filesystem>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <tpm supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tpm-tis</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tpm-crb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>emulator</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>external</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendVersion'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>2.0</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </tpm>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <redirdev supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='bus'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </redirdev>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <channel supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pty</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>unix</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </channel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <crypto supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>qemu</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>builtin</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </crypto>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <interface supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>default</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>passt</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </interface>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <panic supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>isa</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>hyperv</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </panic>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <console supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>null</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pty</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dev</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>file</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pipe</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>stdio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>udp</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tcp</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>unix</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>qemu-vdagent</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dbus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </console>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </devices>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <features>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <gic supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <vmcoreinfo supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <genid supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <backingStoreInput supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <backup supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <async-teardown supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <s390-pv supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <ps2 supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <tdx supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <sev supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <sgx supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <hyperv supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='features'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>relaxed</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vapic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>spinlocks</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vpindex</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>runtime</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>synic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>stimer</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>reset</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vendor_id</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>frequencies</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>reenlightenment</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tlbflush</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>ipi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>avic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>emsr_bitmap</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>xmm_input</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <defaults>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <spinlocks>4095</spinlocks>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <stimer_direct>on</stimer_direct>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <tlbflush_direct>on</tlbflush_direct>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <tlbflush_extended>on</tlbflush_extended>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </defaults>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </hyperv>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <launchSecurity supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </features>
Jan 27 21:10:42 compute-1 nova_compute[182781]: </domainCapabilities>
Jan 27 21:10:42 compute-1 nova_compute[182781]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.831 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 27 21:10:42 compute-1 nova_compute[182781]: <domainCapabilities>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <path>/usr/libexec/qemu-kvm</path>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <domain>kvm</domain>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <arch>x86_64</arch>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <vcpu max='240'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <iothreads supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <os supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <enum name='firmware'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <loader supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>rom</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pflash</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='readonly'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>yes</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>no</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='secure'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>no</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </loader>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </os>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <cpu>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='host-passthrough' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='hostPassthroughMigratable'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>on</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>off</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='maximum' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='maximumMigratable'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>on</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>off</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='host-model' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <vendor>AMD</vendor>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='x2apic'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc-deadline'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='hypervisor'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc_adjust'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='spec-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='stibp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='cmp_legacy'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='overflow-recov'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='succor'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='amd-ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='virt-ssbd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='lbrv'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='tsc-scale'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='vmcb-clean'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='flushbyasid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='pause-filter'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='pfthreshold'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='svme-addr-chk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <feature policy='disable' name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <mode name='custom' supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Broadwell-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cascadelake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='ClearwaterForest'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ddpd-u'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sha512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='ClearwaterForest-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ddpd-u'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sha512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm3'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sm4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Cooperlake-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Denverton-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Dhyana-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Genoa-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Milan-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Rome-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Turin'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbpb'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-Turin-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amd-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='auto-ibrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='perfmon-v2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbpb'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='stibp-always-on'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='EPYC-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-128'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-256'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='GraniteRapids-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-128'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-256'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx10-512'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='prefetchiti'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Haswell-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-noTSX'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v6'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 python3.9[183634]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Icelake-Server-v7'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='IvyBridge-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='KnightsMill'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512er'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512pf'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='KnightsMill-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512er'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512pf'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G4-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tbm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Opteron_G5-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fma4'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tbm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xop'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SapphireRapids-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='amx-tile'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-bf16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-fp16'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bitalg'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrc'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fzrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='la57'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='taa-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='SierraForest-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ifma'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cmpccxadd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fbsdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='fsrs'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ibrs-all'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='intel-psfd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='lam'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mcdt-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pbrsb-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='psdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='serialize'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vaes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Client-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='hle'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='rtm'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Skylake-Server-v5'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512bw'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512cd'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512dq'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512f'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='avx512vl'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='invpcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pcid'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='pku'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='mpx'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v2'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v3'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='core-capability'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='split-lock-detect'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='Snowridge-v4'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='cldemote'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='erms'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='gfni'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdir64b'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='movdiri'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='xsaves'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='athlon'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='athlon-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='core2duo'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='core2duo-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='coreduo'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='coreduo-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='n270'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='n270-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='ss'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='phenom'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <blockers model='phenom-v1'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnow'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <feature name='3dnowext'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </blockers>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </mode>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </cpu>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <memoryBacking supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <enum name='sourceType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>file</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>anonymous</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <value>memfd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </memoryBacking>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <devices>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <disk supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='diskDevice'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>disk</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>cdrom</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>floppy</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>lun</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='bus'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>ide</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>fdc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>scsi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>sata</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-non-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </disk>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <graphics supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vnc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>egl-headless</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dbus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </graphics>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <video supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='modelType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vga</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>cirrus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>none</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>bochs</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>ramfb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </video>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <hostdev supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='mode'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>subsystem</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='startupPolicy'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>default</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>mandatory</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>requisite</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>optional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='subsysType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pci</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>scsi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='capsType'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='pciBackend'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </hostdev>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <rng supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtio-non-transitional</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>random</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>egd</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>builtin</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </rng>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <filesystem supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='driverType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>path</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>handle</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>virtiofs</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </filesystem>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <tpm supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tpm-tis</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tpm-crb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>emulator</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>external</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendVersion'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>2.0</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </tpm>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <redirdev supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='bus'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>usb</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </redirdev>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <channel supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pty</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>unix</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </channel>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <crypto supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>qemu</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendModel'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>builtin</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </crypto>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <interface supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='backendType'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>default</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>passt</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </interface>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <panic supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='model'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>isa</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>hyperv</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </panic>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <console supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='type'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>null</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vc</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pty</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dev</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>file</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>pipe</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>stdio</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>udp</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tcp</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>unix</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>qemu-vdagent</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>dbus</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </console>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </devices>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   <features>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <gic supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <vmcoreinfo supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <genid supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <backingStoreInput supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <backup supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <async-teardown supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <s390-pv supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <ps2 supported='yes'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <tdx supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <sev supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <sgx supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <hyperv supported='yes'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <enum name='features'>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>relaxed</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vapic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>spinlocks</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vpindex</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>runtime</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>synic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>stimer</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>reset</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>vendor_id</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>frequencies</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>reenlightenment</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>tlbflush</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>ipi</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>avic</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>emsr_bitmap</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <value>xmm_input</value>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </enum>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       <defaults>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <spinlocks>4095</spinlocks>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <stimer_direct>on</stimer_direct>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <tlbflush_direct>on</tlbflush_direct>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <tlbflush_extended>on</tlbflush_extended>
Jan 27 21:10:42 compute-1 nova_compute[182781]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 27 21:10:42 compute-1 nova_compute[182781]:       </defaults>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     </hyperv>
Jan 27 21:10:42 compute-1 nova_compute[182781]:     <launchSecurity supported='no'/>
Jan 27 21:10:42 compute-1 nova_compute[182781]:   </features>
Jan 27 21:10:42 compute-1 nova_compute[182781]: </domainCapabilities>
Jan 27 21:10:42 compute-1 nova_compute[182781]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.909 182785 DEBUG nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.909 182785 INFO nova.virt.libvirt.host [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] Secure Boot support detected
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.914 182785 INFO nova.virt.libvirt.driver [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 27 21:10:42 compute-1 nova_compute[182781]: 2026-01-27 21:10:42.914 182785 INFO nova.virt.libvirt.driver [None req-9fa7edab-bdcf-4991-9db7-d23eba882a8d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 27 21:10:42 compute-1 systemd[1]: Stopping nova_compute container...
Jan 27 21:10:43 compute-1 nova_compute[182781]: 2026-01-27 21:10:43.053 182785 DEBUG oslo_concurrency.lockutils [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 21:10:43 compute-1 nova_compute[182781]: 2026-01-27 21:10:43.055 182785 DEBUG oslo_concurrency.lockutils [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 21:10:43 compute-1 nova_compute[182781]: 2026-01-27 21:10:43.055 182785 DEBUG oslo_concurrency.lockutils [None req-6489ec61-a951-451f-9ccc-899eb3ab1f32 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 21:10:43 compute-1 virtqemud[183420]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 27 21:10:43 compute-1 virtqemud[183420]: hostname: compute-1
Jan 27 21:10:43 compute-1 virtqemud[183420]: End of file while reading data: Input/output error
Jan 27 21:10:43 compute-1 systemd[1]: libpod-9ee62307ec8fc65ca0c71dfd43ca3cea2200726d548f6d09e202028328bcaec0.scope: Deactivated successfully.
Jan 27 21:10:43 compute-1 systemd[1]: libpod-9ee62307ec8fc65ca0c71dfd43ca3cea2200726d548f6d09e202028328bcaec0.scope: Consumed 3.171s CPU time.
Jan 27 21:10:43 compute-1 podman[183663]: 2026-01-27 21:10:43.610670899 +0000 UTC m=+0.606062102 container died 9ee62307ec8fc65ca0c71dfd43ca3cea2200726d548f6d09e202028328bcaec0 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_managed=true, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=edpm, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 27 21:10:43 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ee62307ec8fc65ca0c71dfd43ca3cea2200726d548f6d09e202028328bcaec0-userdata-shm.mount: Deactivated successfully.
Jan 27 21:10:43 compute-1 systemd[1]: var-lib-containers-storage-overlay-98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1-merged.mount: Deactivated successfully.
Jan 27 21:10:43 compute-1 podman[183663]: 2026-01-27 21:10:43.675917235 +0000 UTC m=+0.671308438 container cleanup 9ee62307ec8fc65ca0c71dfd43ca3cea2200726d548f6d09e202028328bcaec0 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute)
Jan 27 21:10:43 compute-1 podman[183663]: nova_compute
Jan 27 21:10:43 compute-1 podman[183680]: 2026-01-27 21:10:43.718780767 +0000 UTC m=+0.082612966 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0)
Jan 27 21:10:43 compute-1 podman[183714]: nova_compute
Jan 27 21:10:43 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 27 21:10:43 compute-1 systemd[1]: Stopped nova_compute container.
Jan 27 21:10:43 compute-1 systemd[1]: Starting nova_compute container...
Jan 27 21:10:43 compute-1 systemd[1]: Started libcrun container.
Jan 27 21:10:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:43 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98e32598b41c620a3f26e523954fea140245efc2ef2aa10a9d2d21a1890325c1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:43 compute-1 podman[183734]: 2026-01-27 21:10:43.837311257 +0000 UTC m=+0.074915101 container init 9ee62307ec8fc65ca0c71dfd43ca3cea2200726d548f6d09e202028328bcaec0 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Jan 27 21:10:43 compute-1 podman[183734]: 2026-01-27 21:10:43.845719639 +0000 UTC m=+0.083323483 container start 9ee62307ec8fc65ca0c71dfd43ca3cea2200726d548f6d09e202028328bcaec0 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 27 21:10:43 compute-1 nova_compute[183751]: + sudo -E kolla_set_configs
Jan 27 21:10:43 compute-1 podman[183734]: nova_compute
Jan 27 21:10:43 compute-1 systemd[1]: Started nova_compute container.
Jan 27 21:10:43 compute-1 sudo[183632]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Validating config file
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Copying service configuration files
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Deleting /etc/ceph
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Creating directory /etc/ceph
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /etc/ceph
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Writing out command to execute
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 27 21:10:43 compute-1 nova_compute[183751]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 27 21:10:43 compute-1 nova_compute[183751]: ++ cat /run_command
Jan 27 21:10:43 compute-1 nova_compute[183751]: + CMD=nova-compute
Jan 27 21:10:43 compute-1 nova_compute[183751]: + ARGS=
Jan 27 21:10:43 compute-1 nova_compute[183751]: + sudo kolla_copy_cacerts
Jan 27 21:10:43 compute-1 nova_compute[183751]: + [[ ! -n '' ]]
Jan 27 21:10:43 compute-1 nova_compute[183751]: + . kolla_extend_start
Jan 27 21:10:43 compute-1 nova_compute[183751]: Running command: 'nova-compute'
Jan 27 21:10:43 compute-1 nova_compute[183751]: + echo 'Running command: '\''nova-compute'\'''
Jan 27 21:10:43 compute-1 nova_compute[183751]: + umask 0022
Jan 27 21:10:43 compute-1 nova_compute[183751]: + exec nova-compute
Jan 27 21:10:45 compute-1 sudo[183912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjqekujjolqjzkteuryyoaefzmlybfmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548244.6515174-2443-147963123853858/AnsiballZ_podman_container.py'
Jan 27 21:10:45 compute-1 sudo[183912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:45 compute-1 python3.9[183914]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 27 21:10:45 compute-1 systemd[1]: Started libpod-conmon-9aac9a1db80effee9334b61e2430f8f9192b01860ea0f378e0fa085acac1b158.scope.
Jan 27 21:10:45 compute-1 systemd[1]: Started libcrun container.
Jan 27 21:10:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5abde7a52a6c022602928249c393c2447220593928fbcdd2980d3f91c321a736/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5abde7a52a6c022602928249c393c2447220593928fbcdd2980d3f91c321a736/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5abde7a52a6c022602928249c393c2447220593928fbcdd2980d3f91c321a736/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 27 21:10:45 compute-1 podman[183938]: 2026-01-27 21:10:45.608084077 +0000 UTC m=+0.136629988 container init 9aac9a1db80effee9334b61e2430f8f9192b01860ea0f378e0fa085acac1b158 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init)
Jan 27 21:10:45 compute-1 podman[183938]: 2026-01-27 21:10:45.615228458 +0000 UTC m=+0.143774349 container start 9aac9a1db80effee9334b61e2430f8f9192b01860ea0f378e0fa085acac1b158 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 27 21:10:45 compute-1 python3.9[183914]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Applying nova statedir ownership
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 27 21:10:45 compute-1 nova_compute_init[183960]: INFO:nova_statedir:Nova statedir ownership complete
Jan 27 21:10:45 compute-1 systemd[1]: libpod-9aac9a1db80effee9334b61e2430f8f9192b01860ea0f378e0fa085acac1b158.scope: Deactivated successfully.
Jan 27 21:10:45 compute-1 podman[183961]: 2026-01-27 21:10:45.670350328 +0000 UTC m=+0.028897350 container died 9aac9a1db80effee9334b61e2430f8f9192b01860ea0f378e0fa085acac1b158 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.build-date=20260126, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:10:45 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9aac9a1db80effee9334b61e2430f8f9192b01860ea0f378e0fa085acac1b158-userdata-shm.mount: Deactivated successfully.
Jan 27 21:10:45 compute-1 systemd[1]: var-lib-containers-storage-overlay-5abde7a52a6c022602928249c393c2447220593928fbcdd2980d3f91c321a736-merged.mount: Deactivated successfully.
Jan 27 21:10:45 compute-1 podman[183974]: 2026-01-27 21:10:45.736509838 +0000 UTC m=+0.051971913 container cleanup 9aac9a1db80effee9334b61e2430f8f9192b01860ea0f378e0fa085acac1b158 (image=38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.195:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 21:10:45 compute-1 systemd[1]: libpod-conmon-9aac9a1db80effee9334b61e2430f8f9192b01860ea0f378e0fa085acac1b158.scope: Deactivated successfully.
Jan 27 21:10:45 compute-1 sudo[183912]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:45 compute-1 nova_compute[183751]: 2026-01-27 21:10:45.976 183755 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 27 21:10:45 compute-1 nova_compute[183751]: 2026-01-27 21:10:45.977 183755 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 27 21:10:45 compute-1 nova_compute[183751]: 2026-01-27 21:10:45.977 183755 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Jan 27 21:10:45 compute-1 nova_compute[183751]: 2026-01-27 21:10:45.978 183755 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 27 21:10:46 compute-1 nova_compute[183751]: 2026-01-27 21:10:46.099 183755 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:10:46 compute-1 nova_compute[183751]: 2026-01-27 21:10:46.114 183755 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:10:46 compute-1 nova_compute[183751]: 2026-01-27 21:10:46.114 183755 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Jan 27 21:10:46 compute-1 nova_compute[183751]: 2026-01-27 21:10:46.147 183755 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Jan 27 21:10:46 compute-1 nova_compute[183751]: 2026-01-27 21:10:46.148 183755 WARNING oslo_config.cfg [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Jan 27 21:10:46 compute-1 sshd-session[160622]: Connection closed by 192.168.122.30 port 44734
Jan 27 21:10:46 compute-1 sshd-session[160619]: pam_unix(sshd:session): session closed for user zuul
Jan 27 21:10:46 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Jan 27 21:10:46 compute-1 systemd[1]: session-25.scope: Consumed 1min 43.741s CPU time.
Jan 27 21:10:46 compute-1 systemd-logind[786]: Session 25 logged out. Waiting for processes to exit.
Jan 27 21:10:46 compute-1 systemd-logind[786]: Removed session 25.
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.085 183755 INFO nova.virt.driver [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.182 183755 INFO nova.compute.provider_config [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.695 183755 DEBUG oslo_concurrency.lockutils [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.696 183755 DEBUG oslo_concurrency.lockutils [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.696 183755 DEBUG oslo_concurrency.lockutils [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.697 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.697 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.697 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.698 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.698 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.699 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.699 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.699 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.700 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.700 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.700 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.701 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.701 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.701 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.701 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.702 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.702 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.702 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.703 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.703 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.703 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.704 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.704 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.704 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.705 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.705 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.705 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.705 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.706 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.706 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.706 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.707 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.707 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.707 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.707 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.708 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.708 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.708 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.708 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.709 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.709 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.709 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.710 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.710 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.710 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.711 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.711 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.711 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.712 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.712 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.712 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.712 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.713 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.713 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.713 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.714 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.714 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.714 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.714 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.715 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.715 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.715 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.715 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.715 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.716 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.716 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.716 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.716 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.717 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.717 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.717 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.717 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.718 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.718 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.718 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.718 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.719 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.719 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.719 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.719 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.720 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.720 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.720 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] my_shared_fs_storage_ip        = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.721 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.721 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.721 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.721 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.722 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.722 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.723 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.723 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.723 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.724 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.724 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.724 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.725 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.725 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.725 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.725 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.726 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.726 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.727 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.727 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.727 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.728 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.728 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.728 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.729 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.729 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.729 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.730 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.730 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.730 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.730 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.731 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.731 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.731 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.731 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.732 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.732 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.732 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.733 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.733 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.733 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.733 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.733 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.734 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.734 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.734 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.734 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.735 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.735 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.735 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.735 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.736 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.736 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.736 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.736 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.737 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.737 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.737 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.738 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.738 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.738 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.738 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.739 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.739 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.739 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.739 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.740 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.740 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.740 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.741 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.741 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.741 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.741 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.742 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.742 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.742 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.742 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.743 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.743 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.743 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.743 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.744 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.744 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.744 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.744 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.745 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.745 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.745 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.745 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.746 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.746 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.746 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.746 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.747 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.747 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.747 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.747 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.748 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.748 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.748 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.748 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.749 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.749 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.749 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.749 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.749 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.750 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.750 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.750 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.750 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.750 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.750 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.751 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.751 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.751 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.751 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.751 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.752 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.752 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.752 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.752 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.752 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.752 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.753 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.753 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.753 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.753 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.753 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.754 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.754 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.754 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.754 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.754 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.754 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.755 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.755 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.755 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.755 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.755 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.755 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.756 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.756 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.756 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.756 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.756 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.756 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.757 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.757 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.757 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.757 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.757 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.757 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.758 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.758 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.758 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.758 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.758 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.758 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.759 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.759 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.759 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.759 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.759 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.759 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.760 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.760 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.760 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.760 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.760 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.761 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.761 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.761 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.761 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.761 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.761 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.762 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.762 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.762 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.762 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.762 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.762 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.763 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.763 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.763 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.763 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.763 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.763 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.764 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.764 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.764 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.764 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.764 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.764 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.765 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.765 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.765 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.765 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.765 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.765 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.766 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.766 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.766 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.766 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.766 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.766 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.767 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.767 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.767 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.767 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.767 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.767 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.768 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.768 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.768 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.768 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.768 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.768 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.769 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.769 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.769 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.769 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.769 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.769 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.770 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.770 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.770 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.770 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.770 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.770 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.771 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.771 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.771 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.771 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.771 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.771 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.772 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.772 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.772 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.772 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.772 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.773 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.773 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.773 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.773 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.773 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.773 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.774 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.774 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.774 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.774 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.775 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.775 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.775 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.775 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.775 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.775 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.776 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.776 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.776 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.776 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.777 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.777 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.777 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 podman[184030]: 2026-01-27 21:10:47.777749611 +0000 UTC m=+0.074203553 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.777 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.778 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.778 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.778 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.779 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.779 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.779 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.779 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.779 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.780 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.780 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.780 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.780 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.781 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.781 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.781 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.781 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.781 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.782 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.782 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.782 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.782 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.782 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.783 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.783 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.783 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.783 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.784 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.784 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.784 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.784 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.785 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.785 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.785 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.785 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.785 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.785 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.785 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.786 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.786 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.786 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.786 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.786 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.786 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.786 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.787 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.787 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.787 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.787 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.787 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.787 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.787 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.787 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.787 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.788 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.788 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.788 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.788 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.788 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.788 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.788 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.788 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.789 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.789 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.789 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.789 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.789 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.789 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.789 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.789 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.790 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.790 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.790 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.790 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.790 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.790 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.790 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.790 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.791 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.791 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.791 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.791 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.791 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.791 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.791 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.791 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.792 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.792 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.792 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.792 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.792 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.792 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.792 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.792 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.792 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.793 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.793 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.793 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.793 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.793 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.793 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.793 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.793 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.794 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.794 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.794 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.794 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.794 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.794 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.794 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.794 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.795 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.795 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.795 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.795 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.795 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.795 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.795 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.795 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.795 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.796 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.796 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.796 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.796 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.796 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.796 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.796 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.797 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.797 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.797 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.797 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.797 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.797 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.797 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.797 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.798 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.798 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.798 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.798 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.798 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.798 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.798 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.798 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.799 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.799 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.799 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.799 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.799 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.799 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.799 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.799 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.800 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.800 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.800 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.800 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.800 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.800 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.800 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.800 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.800 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.801 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.801 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.801 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.801 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.801 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.801 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.801 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.802 183755 WARNING oslo_config.cfg [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 27 21:10:47 compute-1 nova_compute[183751]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 27 21:10:47 compute-1 nova_compute[183751]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 27 21:10:47 compute-1 nova_compute[183751]: and ``live_migration_inbound_addr`` respectively.
Jan 27 21:10:47 compute-1 nova_compute[183751]: ).  Its value may be silently ignored in the future.
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.802 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.802 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.802 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.802 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.802 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.migration_inbound_addr = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.802 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.803 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.803 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.803 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.803 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.803 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.803 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.803 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.803 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.803 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.804 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.804 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.804 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.804 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.804 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.804 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.804 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.805 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.805 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.805 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.805 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.805 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.805 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.805 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.805 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.806 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.806 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.806 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.806 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.806 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.806 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.806 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.806 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.807 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.807 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.807 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.807 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.807 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.807 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.807 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.807 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.808 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.808 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.808 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.808 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.808 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.808 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.808 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.808 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.809 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.809 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.809 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.809 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.809 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.809 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.809 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.809 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.810 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.810 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.810 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.810 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.810 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.810 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.810 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.810 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.810 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.811 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.811 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.811 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.811 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.811 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.811 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.811 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.811 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.812 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.812 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.812 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.812 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.812 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.812 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.812 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.812 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.813 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.813 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.813 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.813 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.813 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.813 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.813 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.813 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.814 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.814 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.814 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.814 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.814 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.814 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.814 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.814 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.815 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.815 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.815 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.815 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.815 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.815 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.815 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.815 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.815 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.816 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.816 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.816 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.816 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.816 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.816 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.816 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.816 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.817 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.817 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.817 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.817 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.817 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.817 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.817 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.817 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.817 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.818 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.818 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.818 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.818 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.818 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.818 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.818 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.818 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.819 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.819 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.819 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.819 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.819 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.819 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.819 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.819 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.820 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.820 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.820 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.820 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.820 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.820 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.820 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.820 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.821 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.821 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.821 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.821 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.821 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.821 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.821 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.821 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.821 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.822 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.822 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.822 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.822 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.822 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.822 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.822 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.822 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.823 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.823 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.823 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.823 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.823 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.823 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.823 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.823 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.824 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.824 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.824 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.824 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.824 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.824 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.824 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.824 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.824 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.825 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.825 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.825 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.825 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.825 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.825 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.825 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.826 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.826 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.826 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.826 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.826 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.826 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.826 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.826 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.827 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.827 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.827 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.827 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.827 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.827 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.827 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.828 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.828 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.828 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.828 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.828 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.828 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.828 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.829 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.829 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.829 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.829 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.829 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.829 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.829 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.830 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.830 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.830 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.830 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.830 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.830 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.830 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.830 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.831 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.831 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.831 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.831 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.831 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.831 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.831 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.831 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.831 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.832 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.832 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.832 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.832 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.832 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.832 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.832 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.832 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.832 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.833 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.833 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.833 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.833 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.833 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.833 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.833 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.833 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.834 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.834 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.834 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.834 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.834 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.834 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.834 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.835 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.835 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.835 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.835 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.835 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.835 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.835 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.835 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.836 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.836 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.836 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.836 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.836 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.836 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.836 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.836 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.836 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.837 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.837 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.837 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.837 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.837 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.837 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.837 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.837 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.838 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.838 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.838 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.838 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.838 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.838 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.838 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.838 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.839 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.839 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.839 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.839 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.839 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.839 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.839 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.839 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.hostname = compute-1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.840 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.840 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.840 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.840 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.840 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.840 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.840 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.840 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.840 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.841 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.841 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.841 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.841 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.841 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.841 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.841 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.841 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.842 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.842 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.842 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.842 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.842 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.842 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.842 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.842 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.843 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.843 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.843 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.843 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.843 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.843 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.843 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.843 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.844 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.844 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.844 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.844 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.844 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.844 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.844 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.844 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.844 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.845 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.845 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.845 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.845 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.845 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.845 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.845 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.845 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.846 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.846 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.846 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.846 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.846 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.846 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.846 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.846 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.846 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.847 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.847 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.847 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.847 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.847 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.847 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.847 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.847 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.848 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.848 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.848 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.848 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.848 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.848 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.848 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.848 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.848 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.849 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.849 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.849 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.849 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.849 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.849 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.849 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.849 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.850 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.850 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.850 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.850 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.850 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.850 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.850 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.851 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.851 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.851 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.851 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.851 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.851 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.851 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.851 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.851 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.852 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.852 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.852 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.852 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.852 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.852 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.852 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.852 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.853 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.853 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.853 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.853 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.853 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.853 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.853 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.853 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.853 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.854 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.854 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.854 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.854 183755 DEBUG oslo_service.backend._eventlet.service [None req-560c056f-ffb5-4ade-bb68-6c5d01ad2faa - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Jan 27 21:10:47 compute-1 nova_compute[183751]: 2026-01-27 21:10:47.855 183755 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.362 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.376 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f126d527bf0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Jan 27 21:10:48 compute-1 nova_compute[183751]: libvirt:  error : internal error: could not initialize domain event timer
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.377 183755 WARNING nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.377 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f126d527bf0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.379 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.380 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.380 183755 INFO nova.utils [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] The default thread pool MainProcess.default is initialized
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.381 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.381 183755 INFO nova.virt.libvirt.driver [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Connection event '1' reason 'None'
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.388 183755 INFO nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Libvirt host capabilities <capabilities>
Jan 27 21:10:48 compute-1 nova_compute[183751]: 
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <host>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <uuid>3b9a1f76-d315-49d8-90b4-a523eb6cf5fa</uuid>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <arch>x86_64</arch>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model>EPYC-Rome-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <vendor>AMD</vendor>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <microcode version='16777317'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <signature family='23' model='49' stepping='0'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='x2apic'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='tsc-deadline'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='osxsave'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='hypervisor'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='tsc_adjust'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='spec-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='stibp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='arch-capabilities'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='cmp_legacy'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='topoext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='virt-ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='lbrv'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='tsc-scale'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='vmcb-clean'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='pause-filter'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='pfthreshold'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='svme-addr-chk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='rdctl-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='skip-l1dfl-vmentry'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='mds-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature name='pschange-mc-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <pages unit='KiB' size='4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <pages unit='KiB' size='2048'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <pages unit='KiB' size='1048576'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <power_management>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <suspend_mem/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <suspend_disk/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <suspend_hybrid/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </power_management>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <iommu support='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <migration_features>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <live/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <uri_transports>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <uri_transport>tcp</uri_transport>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <uri_transport>rdma</uri_transport>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </uri_transports>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </migration_features>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <topology>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <cells num='1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <cell id='0'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:           <memory unit='KiB'>7864308</memory>
Jan 27 21:10:48 compute-1 nova_compute[183751]:           <pages unit='KiB' size='4'>1966077</pages>
Jan 27 21:10:48 compute-1 nova_compute[183751]:           <pages unit='KiB' size='2048'>0</pages>
Jan 27 21:10:48 compute-1 nova_compute[183751]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 27 21:10:48 compute-1 nova_compute[183751]:           <distances>
Jan 27 21:10:48 compute-1 nova_compute[183751]:             <sibling id='0' value='10'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:           </distances>
Jan 27 21:10:48 compute-1 nova_compute[183751]:           <cpus num='8'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:           </cpus>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         </cell>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </cells>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </topology>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <cache>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </cache>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <secmodel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model>selinux</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <doi>0</doi>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </secmodel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <secmodel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model>dac</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <doi>0</doi>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </secmodel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </host>
Jan 27 21:10:48 compute-1 nova_compute[183751]: 
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <guest>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <os_type>hvm</os_type>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <arch name='i686'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <wordsize>32</wordsize>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <domain type='qemu'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <domain type='kvm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </arch>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <features>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <pae/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <nonpae/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <acpi default='on' toggle='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <apic default='on' toggle='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <cpuselection/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <deviceboot/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <disksnapshot default='on' toggle='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <externalSnapshot/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </features>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </guest>
Jan 27 21:10:48 compute-1 nova_compute[183751]: 
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <guest>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <os_type>hvm</os_type>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <arch name='x86_64'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <wordsize>64</wordsize>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <domain type='qemu'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <domain type='kvm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </arch>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <features>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <acpi default='on' toggle='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <apic default='on' toggle='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <cpuselection/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <deviceboot/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <disksnapshot default='on' toggle='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <externalSnapshot/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </features>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </guest>
Jan 27 21:10:48 compute-1 nova_compute[183751]: 
Jan 27 21:10:48 compute-1 nova_compute[183751]: </capabilities>
Jan 27 21:10:48 compute-1 nova_compute[183751]: 
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.397 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.400 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 27 21:10:48 compute-1 nova_compute[183751]: <domainCapabilities>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <path>/usr/libexec/qemu-kvm</path>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <domain>kvm</domain>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <arch>i686</arch>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <vcpu max='240'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <iothreads supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <os supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <enum name='firmware'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <loader supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>rom</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pflash</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='readonly'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>yes</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>no</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='secure'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>no</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </loader>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </os>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='host-passthrough' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='hostPassthroughMigratable'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>on</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>off</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='maximum' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='maximumMigratable'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>on</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>off</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='host-model' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <vendor>AMD</vendor>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='x2apic'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc-deadline'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='hypervisor'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc_adjust'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='spec-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='stibp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='cmp_legacy'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='overflow-recov'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='succor'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='amd-ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='virt-ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='lbrv'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc-scale'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='vmcb-clean'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='flushbyasid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='pause-filter'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='pfthreshold'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='svme-addr-chk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='disable' name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='custom' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='ClearwaterForest'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ddpd-u'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sha512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='ClearwaterForest-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ddpd-u'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sha512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Dhyana-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Turin'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbpb'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Turin-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbpb'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-128'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-256'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-128'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-256'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v6'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v7'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='KnightsMill'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512er'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512pf'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='KnightsMill-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512er'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512pf'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G4-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tbm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G5-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tbm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='athlon'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='athlon-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='core2duo'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='core2duo-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='coreduo'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='coreduo-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='n270'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='n270-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='phenom'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='phenom-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <memoryBacking supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <enum name='sourceType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>file</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>anonymous</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>memfd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </memoryBacking>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <devices>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <disk supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='diskDevice'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>disk</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>cdrom</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>floppy</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>lun</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='bus'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>ide</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>fdc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>scsi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>sata</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-non-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </disk>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <graphics supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vnc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>egl-headless</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dbus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </graphics>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <video supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='modelType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vga</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>cirrus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>none</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>bochs</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>ramfb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </video>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <hostdev supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='mode'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>subsystem</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='startupPolicy'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>default</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>mandatory</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>requisite</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>optional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='subsysType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pci</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>scsi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='capsType'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='pciBackend'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </hostdev>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <rng supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-non-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>random</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>egd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>builtin</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </rng>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <filesystem supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='driverType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>path</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>handle</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtiofs</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </filesystem>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <tpm supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tpm-tis</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tpm-crb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>emulator</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>external</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendVersion'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>2.0</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </tpm>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <redirdev supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='bus'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </redirdev>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <channel supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pty</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>unix</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </channel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <crypto supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>qemu</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>builtin</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </crypto>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <interface supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>default</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>passt</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </interface>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <panic supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>isa</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>hyperv</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </panic>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <console supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>null</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pty</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dev</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>file</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pipe</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>stdio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>udp</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tcp</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>unix</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>qemu-vdagent</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dbus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </console>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </devices>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <features>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <gic supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <vmcoreinfo supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <genid supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <backingStoreInput supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <backup supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <async-teardown supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <s390-pv supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <ps2 supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <tdx supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <sev supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <sgx supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <hyperv supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='features'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>relaxed</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vapic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>spinlocks</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vpindex</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>runtime</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>synic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>stimer</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>reset</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vendor_id</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>frequencies</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>reenlightenment</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tlbflush</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>ipi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>avic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>emsr_bitmap</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>xmm_input</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <defaults>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <spinlocks>4095</spinlocks>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <stimer_direct>on</stimer_direct>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <tlbflush_direct>on</tlbflush_direct>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <tlbflush_extended>on</tlbflush_extended>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </defaults>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </hyperv>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <launchSecurity supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </features>
Jan 27 21:10:48 compute-1 nova_compute[183751]: </domainCapabilities>
Jan 27 21:10:48 compute-1 nova_compute[183751]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.412 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 27 21:10:48 compute-1 nova_compute[183751]: <domainCapabilities>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <path>/usr/libexec/qemu-kvm</path>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <domain>kvm</domain>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <arch>i686</arch>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <vcpu max='4096'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <iothreads supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <os supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <enum name='firmware'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <loader supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>rom</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pflash</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='readonly'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>yes</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>no</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='secure'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>no</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </loader>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </os>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='host-passthrough' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='hostPassthroughMigratable'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>on</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>off</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='maximum' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='maximumMigratable'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>on</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>off</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='host-model' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <vendor>AMD</vendor>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='x2apic'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc-deadline'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='hypervisor'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc_adjust'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='spec-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='stibp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='cmp_legacy'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='overflow-recov'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='succor'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='amd-ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='virt-ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='lbrv'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc-scale'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='vmcb-clean'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='flushbyasid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='pause-filter'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='pfthreshold'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='svme-addr-chk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='disable' name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='custom' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='ClearwaterForest'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ddpd-u'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sha512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='ClearwaterForest-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ddpd-u'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sha512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Dhyana-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Turin'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbpb'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Turin-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbpb'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-128'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-256'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-128'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-256'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v6'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v7'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='KnightsMill'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512er'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512pf'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='KnightsMill-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512er'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512pf'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G4-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tbm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G5-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tbm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='athlon'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='athlon-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='core2duo'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='core2duo-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='coreduo'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='coreduo-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='n270'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='n270-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='phenom'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='phenom-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <memoryBacking supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <enum name='sourceType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>file</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>anonymous</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>memfd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </memoryBacking>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <devices>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <disk supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='diskDevice'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>disk</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>cdrom</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>floppy</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>lun</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='bus'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>fdc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>scsi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>sata</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-non-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </disk>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <graphics supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vnc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>egl-headless</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dbus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </graphics>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <video supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='modelType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vga</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>cirrus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>none</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>bochs</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>ramfb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </video>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <hostdev supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='mode'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>subsystem</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='startupPolicy'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>default</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>mandatory</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>requisite</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>optional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='subsysType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pci</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>scsi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='capsType'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='pciBackend'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </hostdev>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <rng supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-non-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>random</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>egd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>builtin</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </rng>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <filesystem supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='driverType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>path</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>handle</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtiofs</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </filesystem>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <tpm supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tpm-tis</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tpm-crb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>emulator</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>external</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendVersion'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>2.0</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </tpm>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <redirdev supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='bus'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </redirdev>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <channel supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pty</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>unix</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </channel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <crypto supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>qemu</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>builtin</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </crypto>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <interface supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>default</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>passt</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </interface>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <panic supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>isa</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>hyperv</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </panic>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <console supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>null</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pty</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dev</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>file</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pipe</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>stdio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>udp</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tcp</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>unix</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>qemu-vdagent</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dbus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </console>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </devices>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <features>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <gic supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <vmcoreinfo supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <genid supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <backingStoreInput supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <backup supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <async-teardown supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <s390-pv supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <ps2 supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <tdx supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <sev supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <sgx supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <hyperv supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='features'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>relaxed</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vapic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>spinlocks</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vpindex</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>runtime</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>synic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>stimer</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>reset</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vendor_id</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>frequencies</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>reenlightenment</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tlbflush</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>ipi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>avic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>emsr_bitmap</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>xmm_input</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <defaults>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <spinlocks>4095</spinlocks>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <stimer_direct>on</stimer_direct>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <tlbflush_direct>on</tlbflush_direct>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <tlbflush_extended>on</tlbflush_extended>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </defaults>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </hyperv>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <launchSecurity supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </features>
Jan 27 21:10:48 compute-1 nova_compute[183751]: </domainCapabilities>
Jan 27 21:10:48 compute-1 nova_compute[183751]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.459 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.464 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 27 21:10:48 compute-1 nova_compute[183751]: <domainCapabilities>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <path>/usr/libexec/qemu-kvm</path>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <domain>kvm</domain>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <arch>x86_64</arch>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <vcpu max='240'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <iothreads supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <os supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <enum name='firmware'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <loader supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>rom</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pflash</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='readonly'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>yes</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>no</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='secure'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>no</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </loader>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </os>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='host-passthrough' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='hostPassthroughMigratable'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>on</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>off</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='maximum' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='maximumMigratable'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>on</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>off</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='host-model' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <vendor>AMD</vendor>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='x2apic'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc-deadline'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='hypervisor'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc_adjust'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='spec-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='stibp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='cmp_legacy'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='overflow-recov'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='succor'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='amd-ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='virt-ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='lbrv'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc-scale'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='vmcb-clean'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='flushbyasid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='pause-filter'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='pfthreshold'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='svme-addr-chk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='disable' name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='custom' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='ClearwaterForest'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ddpd-u'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sha512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='ClearwaterForest-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ddpd-u'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sha512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Dhyana-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Turin'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbpb'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Turin-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbpb'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-128'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-256'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-128'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-256'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v6'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v7'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='KnightsMill'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512er'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512pf'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='KnightsMill-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512er'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512pf'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G4-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tbm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G5-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tbm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='athlon'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='athlon-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='core2duo'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='core2duo-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='coreduo'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='coreduo-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='n270'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='n270-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='phenom'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='phenom-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <memoryBacking supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <enum name='sourceType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>file</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>anonymous</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>memfd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </memoryBacking>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <devices>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <disk supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='diskDevice'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>disk</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>cdrom</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>floppy</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>lun</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='bus'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>ide</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>fdc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>scsi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>sata</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-non-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </disk>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <graphics supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vnc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>egl-headless</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dbus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </graphics>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <video supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='modelType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vga</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>cirrus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>none</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>bochs</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>ramfb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </video>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <hostdev supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='mode'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>subsystem</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='startupPolicy'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>default</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>mandatory</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>requisite</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>optional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='subsysType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pci</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>scsi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='capsType'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='pciBackend'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </hostdev>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <rng supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-non-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>random</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>egd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>builtin</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </rng>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <filesystem supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='driverType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>path</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>handle</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtiofs</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </filesystem>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <tpm supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tpm-tis</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tpm-crb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>emulator</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>external</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendVersion'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>2.0</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </tpm>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <redirdev supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='bus'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </redirdev>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <channel supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pty</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>unix</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </channel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <crypto supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>qemu</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>builtin</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </crypto>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <interface supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>default</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>passt</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </interface>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <panic supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>isa</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>hyperv</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </panic>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <console supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>null</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pty</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dev</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>file</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pipe</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>stdio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>udp</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tcp</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>unix</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>qemu-vdagent</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dbus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </console>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </devices>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <features>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <gic supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <vmcoreinfo supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <genid supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <backingStoreInput supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <backup supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <async-teardown supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <s390-pv supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <ps2 supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <tdx supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <sev supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <sgx supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <hyperv supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='features'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>relaxed</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vapic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>spinlocks</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vpindex</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>runtime</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>synic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>stimer</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>reset</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vendor_id</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>frequencies</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>reenlightenment</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tlbflush</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>ipi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>avic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>emsr_bitmap</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>xmm_input</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <defaults>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <spinlocks>4095</spinlocks>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <stimer_direct>on</stimer_direct>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <tlbflush_direct>on</tlbflush_direct>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <tlbflush_extended>on</tlbflush_extended>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </defaults>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </hyperv>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <launchSecurity supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </features>
Jan 27 21:10:48 compute-1 nova_compute[183751]: </domainCapabilities>
Jan 27 21:10:48 compute-1 nova_compute[183751]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.542 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 27 21:10:48 compute-1 nova_compute[183751]: <domainCapabilities>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <path>/usr/libexec/qemu-kvm</path>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <domain>kvm</domain>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <arch>x86_64</arch>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <vcpu max='4096'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <iothreads supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <os supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <enum name='firmware'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>efi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <loader supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>rom</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pflash</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='readonly'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>yes</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>no</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='secure'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>yes</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>no</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </loader>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </os>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='host-passthrough' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='hostPassthroughMigratable'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>on</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>off</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='maximum' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='maximumMigratable'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>on</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>off</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='host-model' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <vendor>AMD</vendor>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='x2apic'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc-deadline'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='hypervisor'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc_adjust'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='spec-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='stibp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='cmp_legacy'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='overflow-recov'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='succor'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='amd-ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='virt-ssbd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='lbrv'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='tsc-scale'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='vmcb-clean'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='flushbyasid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='pause-filter'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='pfthreshold'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='svme-addr-chk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <feature policy='disable' name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <mode name='custom' supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Broadwell-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cascadelake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='ClearwaterForest'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ddpd-u'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sha512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='ClearwaterForest-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ddpd-u'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sha512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm3'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sm4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Cooperlake-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Denverton-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Dhyana-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Genoa-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Milan-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Rome-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Turin'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbpb'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-Turin-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amd-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='auto-ibrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vp2intersect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fs-gs-base-ns'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibpb-brtype'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='no-nested-data-bp'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='null-sel-clr-base'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='perfmon-v2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbpb'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='srso-user-kernel-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='stibp-always-on'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='EPYC-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-128'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-256'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='GraniteRapids-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-128'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-256'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx10-512'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='prefetchiti'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Haswell-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-noTSX'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v6'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Icelake-Server-v7'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='IvyBridge-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='KnightsMill'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512er'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512pf'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='KnightsMill-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4fmaps'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-4vnniw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512er'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512pf'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G4-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tbm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Opteron_G5-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fma4'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tbm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xop'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SapphireRapids-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='amx-tile'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-bf16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-fp16'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512-vpopcntdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bitalg'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vbmi2'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrc'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fzrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='la57'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='taa-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='tsx-ldtrk'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='SierraForest-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ifma'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-ne-convert'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx-vnni-int8'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bhi-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='bus-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cmpccxadd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fbsdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='fsrs'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ibrs-all'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='intel-psfd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ipred-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='lam'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mcdt-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pbrsb-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='psdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rrsba-ctrl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='sbdr-ssdp-no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='serialize'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vaes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='vpclmulqdq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Client-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='hle'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='rtm'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Skylake-Server-v5'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512bw'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512cd'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512dq'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512f'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='avx512vl'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='invpcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pcid'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='pku'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='mpx'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v2'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v3'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='core-capability'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='split-lock-detect'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='Snowridge-v4'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='cldemote'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='erms'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='gfni'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdir64b'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='movdiri'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='xsaves'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='athlon'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='athlon-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='core2duo'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='core2duo-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='coreduo'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='coreduo-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='n270'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='n270-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='ss'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='phenom'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <blockers model='phenom-v1'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnow'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <feature name='3dnowext'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </blockers>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </mode>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <memoryBacking supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <enum name='sourceType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>file</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>anonymous</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <value>memfd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </memoryBacking>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <devices>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <disk supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='diskDevice'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>disk</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>cdrom</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>floppy</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>lun</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='bus'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>fdc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>scsi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>sata</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-non-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </disk>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <graphics supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vnc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>egl-headless</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dbus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </graphics>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <video supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='modelType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vga</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>cirrus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>none</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>bochs</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>ramfb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </video>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <hostdev supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='mode'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>subsystem</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='startupPolicy'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>default</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>mandatory</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>requisite</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>optional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='subsysType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pci</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>scsi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='capsType'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='pciBackend'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </hostdev>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <rng supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtio-non-transitional</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>random</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>egd</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>builtin</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </rng>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <filesystem supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='driverType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>path</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>handle</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>virtiofs</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </filesystem>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <tpm supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tpm-tis</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tpm-crb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>emulator</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>external</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendVersion'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>2.0</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </tpm>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <redirdev supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='bus'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>usb</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </redirdev>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <channel supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pty</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>unix</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </channel>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <crypto supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>qemu</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendModel'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>builtin</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </crypto>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <interface supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='backendType'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>default</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>passt</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </interface>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <panic supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='model'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>isa</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>hyperv</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </panic>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <console supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='type'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>null</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vc</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pty</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dev</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>file</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>pipe</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>stdio</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>udp</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tcp</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>unix</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>qemu-vdagent</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>dbus</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </console>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </devices>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <features>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <gic supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <vmcoreinfo supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <genid supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <backingStoreInput supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <backup supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <async-teardown supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <s390-pv supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <ps2 supported='yes'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <tdx supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <sev supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <sgx supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <hyperv supported='yes'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <enum name='features'>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>relaxed</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vapic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>spinlocks</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vpindex</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>runtime</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>synic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>stimer</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>reset</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>vendor_id</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>frequencies</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>reenlightenment</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>tlbflush</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>ipi</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>avic</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>emsr_bitmap</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <value>xmm_input</value>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </enum>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       <defaults>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <spinlocks>4095</spinlocks>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <stimer_direct>on</stimer_direct>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <tlbflush_direct>on</tlbflush_direct>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <tlbflush_extended>on</tlbflush_extended>
Jan 27 21:10:48 compute-1 nova_compute[183751]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 27 21:10:48 compute-1 nova_compute[183751]:       </defaults>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     </hyperv>
Jan 27 21:10:48 compute-1 nova_compute[183751]:     <launchSecurity supported='no'/>
Jan 27 21:10:48 compute-1 nova_compute[183751]:   </features>
Jan 27 21:10:48 compute-1 nova_compute[183751]: </domainCapabilities>
Jan 27 21:10:48 compute-1 nova_compute[183751]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.656 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.657 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.657 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.679 183755 INFO nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Secure Boot support detected
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.688 183755 INFO nova.virt.libvirt.driver [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.688 183755 INFO nova.virt.libvirt.driver [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.701 183755 DEBUG nova.virt.libvirt.driver [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 27 21:10:48 compute-1 nova_compute[183751]:   <model>Nehalem</model>
Jan 27 21:10:48 compute-1 nova_compute[183751]: </cpu>
Jan 27 21:10:48 compute-1 nova_compute[183751]:  _compare_cpu /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10922
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.703 183755 DEBUG nova.virt.libvirt.driver [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.889 183755 WARNING nova.virt.libvirt.driver [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 27 21:10:48 compute-1 nova_compute[183751]: 2026-01-27 21:10:48.889 183755 DEBUG nova.virt.libvirt.volume.mount [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 27 21:10:49 compute-1 nova_compute[183751]: 2026-01-27 21:10:49.216 183755 INFO nova.virt.node [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Determined node identity 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from /var/lib/nova/compute_id
Jan 27 21:10:49 compute-1 nova_compute[183751]: 2026-01-27 21:10:49.726 183755 WARNING nova.compute.manager [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Compute nodes ['18406e9c-09cc-4d76-bc69-d3d1c0683e05'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 27 21:10:50 compute-1 nova_compute[183751]: 2026-01-27 21:10:50.738 183755 INFO nova.compute.manager [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.754 183755 WARNING nova.compute.manager [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.755 183755 DEBUG oslo_concurrency.lockutils [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.755 183755 DEBUG oslo_concurrency.lockutils [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.756 183755 DEBUG oslo_concurrency.lockutils [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.756 183755 DEBUG nova.compute.resource_tracker [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.903 183755 WARNING nova.virt.libvirt.driver [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.904 183755 DEBUG oslo_concurrency.processutils [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.922 183755 DEBUG oslo_concurrency.processutils [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.922 183755 DEBUG nova.compute.resource_tracker [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6181MB free_disk=73.34655380249023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.923 183755 DEBUG oslo_concurrency.lockutils [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:10:51 compute-1 nova_compute[183751]: 2026-01-27 21:10:51.923 183755 DEBUG oslo_concurrency.lockutils [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:10:51 compute-1 sshd-session[184069]: Accepted publickey for zuul from 192.168.122.30 port 38072 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 21:10:51 compute-1 systemd-logind[786]: New session 27 of user zuul.
Jan 27 21:10:52 compute-1 systemd[1]: Started Session 27 of User zuul.
Jan 27 21:10:52 compute-1 sshd-session[184069]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 21:10:52 compute-1 nova_compute[183751]: 2026-01-27 21:10:52.429 183755 WARNING nova.compute.resource_tracker [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] No compute node record for compute-1.ctlplane.example.com:18406e9c-09cc-4d76-bc69-d3d1c0683e05: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 18406e9c-09cc-4d76-bc69-d3d1c0683e05 could not be found.
Jan 27 21:10:52 compute-1 nova_compute[183751]: 2026-01-27 21:10:52.943 183755 INFO nova.compute.resource_tracker [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 18406e9c-09cc-4d76-bc69-d3d1c0683e05
Jan 27 21:10:53 compute-1 python3.9[184223]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 27 21:10:54 compute-1 sudo[184377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aweqllnqbspoxggtmrxysyliuisjgpyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548253.7442458-48-215913230297844/AnsiballZ_systemd_service.py'
Jan 27 21:10:54 compute-1 sudo[184377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:10:54 compute-1 nova_compute[183751]: 2026-01-27 21:10:54.467 183755 DEBUG nova.compute.resource_tracker [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:10:54 compute-1 nova_compute[183751]: 2026-01-27 21:10:54.468 183755 DEBUG nova.compute.resource_tracker [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:10:51 up  1:13,  0 user,  load average: 1.01, 0.74, 0.52\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:10:54 compute-1 python3.9[184379]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:10:54 compute-1 systemd[1]: Reloading.
Jan 27 21:10:54 compute-1 systemd-sysv-generator[184405]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:10:54 compute-1 systemd-rc-local-generator[184402]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:10:54 compute-1 sudo[184377]: pam_unix(sudo:session): session closed for user root
Jan 27 21:10:55 compute-1 nova_compute[183751]: 2026-01-27 21:10:55.388 183755 INFO nova.scheduler.client.report [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] [req-de7738d3-54da-4a05-876a-6b61246b53c6] Created resource provider record via placement API for resource provider with UUID 18406e9c-09cc-4d76-bc69-d3d1c0683e05 and name compute-1.ctlplane.example.com.
Jan 27 21:10:55 compute-1 nova_compute[183751]: 2026-01-27 21:10:55.480 183755 DEBUG nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 27 21:10:55 compute-1 nova_compute[183751]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Jan 27 21:10:55 compute-1 nova_compute[183751]: 2026-01-27 21:10:55.480 183755 INFO nova.virt.libvirt.host [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] kernel doesn't support AMD SEV
Jan 27 21:10:55 compute-1 nova_compute[183751]: 2026-01-27 21:10:55.480 183755 DEBUG nova.compute.provider_tree [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:10:55 compute-1 nova_compute[183751]: 2026-01-27 21:10:55.481 183755 DEBUG nova.virt.libvirt.driver [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 21:10:55 compute-1 nova_compute[183751]: 2026-01-27 21:10:55.484 183755 DEBUG nova.virt.libvirt.driver [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Libvirt baseline CPU <cpu>
Jan 27 21:10:55 compute-1 nova_compute[183751]:   <arch>x86_64</arch>
Jan 27 21:10:55 compute-1 nova_compute[183751]:   <model>Nehalem</model>
Jan 27 21:10:55 compute-1 nova_compute[183751]:   <vendor>AMD</vendor>
Jan 27 21:10:55 compute-1 nova_compute[183751]:   <topology sockets="8" cores="1" threads="1"/>
Jan 27 21:10:55 compute-1 nova_compute[183751]:   <maxphysaddr mode="emulate" bits="40"/>
Jan 27 21:10:55 compute-1 nova_compute[183751]: </cpu>
Jan 27 21:10:55 compute-1 nova_compute[183751]:  _get_guest_baseline_cpu_features /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13545
Jan 27 21:10:55 compute-1 python3.9[184564]: ansible-ansible.builtin.service_facts Invoked
Jan 27 21:10:55 compute-1 network[184581]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 27 21:10:55 compute-1 network[184582]: 'network-scripts' will be removed from distribution in near future.
Jan 27 21:10:55 compute-1 network[184583]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 27 21:10:56 compute-1 nova_compute[183751]: 2026-01-27 21:10:56.074 183755 DEBUG nova.scheduler.client.report [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Updated inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Jan 27 21:10:56 compute-1 nova_compute[183751]: 2026-01-27 21:10:56.075 183755 DEBUG nova.compute.provider_tree [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Updating resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 27 21:10:56 compute-1 nova_compute[183751]: 2026-01-27 21:10:56.075 183755 DEBUG nova.compute.provider_tree [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:10:56 compute-1 nova_compute[183751]: 2026-01-27 21:10:56.199 183755 DEBUG nova.compute.provider_tree [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Updating resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 27 21:10:56 compute-1 nova_compute[183751]: 2026-01-27 21:10:56.707 183755 DEBUG nova.compute.resource_tracker [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:10:56 compute-1 nova_compute[183751]: 2026-01-27 21:10:56.707 183755 DEBUG oslo_concurrency.lockutils [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.784s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:10:56 compute-1 nova_compute[183751]: 2026-01-27 21:10:56.708 183755 DEBUG nova.service [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Jan 27 21:10:56 compute-1 nova_compute[183751]: 2026-01-27 21:10:56.780 183755 DEBUG nova.service [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Jan 27 21:10:56 compute-1 nova_compute[183751]: 2026-01-27 21:10:56.781 183755 DEBUG nova.servicegroup.drivers.db [None req-426496b9-d828-4840-b7ea-171217855781 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Jan 27 21:11:01 compute-1 sudo[184853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbmjftjebdeyfqrplubxeldqgihklqhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548260.7920194-86-173651299692655/AnsiballZ_systemd_service.py'
Jan 27 21:11:01 compute-1 sudo[184853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:01 compute-1 python3.9[184855]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:11:01 compute-1 sudo[184853]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:02 compute-1 sudo[185006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wktbgavuiyzzlijvhzgwdoqqqbsjbadq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548261.882588-106-153916132829849/AnsiballZ_file.py'
Jan 27 21:11:02 compute-1 sudo[185006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:02 compute-1 python3.9[185008]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:02 compute-1 sudo[185006]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:02 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 27 21:11:03 compute-1 sudo[185159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqpzcizjcefxctoocoxfuujcjudeozhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548262.8409472-122-151306854262970/AnsiballZ_file.py'
Jan 27 21:11:03 compute-1 sudo[185159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:03 compute-1 python3.9[185161]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:03 compute-1 sudo[185159]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:04 compute-1 sudo[185311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iumzmwgapjcztalqcwvydcwkmpercjis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548263.840515-140-42712629730343/AnsiballZ_command.py'
Jan 27 21:11:04 compute-1 sudo[185311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:04 compute-1 python3.9[185313]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:11:04 compute-1 sudo[185311]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:07 compute-1 python3.9[185465]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 27 21:11:08 compute-1 sudo[185615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sudvuypahgyymhmljigdogqwtjjoqhhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548267.4331708-176-131425723621190/AnsiballZ_systemd_service.py'
Jan 27 21:11:08 compute-1 sudo[185615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:08 compute-1 python3.9[185617]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:11:08 compute-1 systemd[1]: Reloading.
Jan 27 21:11:08 compute-1 systemd-rc-local-generator[185641]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:11:08 compute-1 systemd-sysv-generator[185646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:11:09 compute-1 sudo[185615]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:09 compute-1 sudo[185803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afjwjwbxeontozjyfbcyrufyinshirqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548269.412184-192-263844867972104/AnsiballZ_command.py'
Jan 27 21:11:09 compute-1 sudo[185803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:09 compute-1 nova_compute[183751]: 2026-01-27 21:11:09.784 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:11:09 compute-1 python3.9[185805]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:11:09 compute-1 sudo[185803]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:10 compute-1 nova_compute[183751]: 2026-01-27 21:11:10.296 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:11:10 compute-1 sudo[185956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxrjbsyfuoglajfuznimtbladpwnfgcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548270.3204007-210-133305080174023/AnsiballZ_file.py'
Jan 27 21:11:10 compute-1 sudo[185956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:10 compute-1 python3.9[185958]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:11:10 compute-1 sudo[185956]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:11:11.150 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:11:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:11:11.150 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:11:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:11:11.150 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:11:11 compute-1 python3.9[186109]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:11:12 compute-1 sudo[186261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfinefwjnqabzmtbbpbcwotbvzwmncdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548271.948189-242-131370543814879/AnsiballZ_group.py'
Jan 27 21:11:12 compute-1 sudo[186261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:12 compute-1 python3.9[186263]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 27 21:11:12 compute-1 sudo[186261]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:13 compute-1 sudo[186413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uowlloaspcealseysuodqlfusrwtllvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548273.007055-264-187474865837368/AnsiballZ_getent.py'
Jan 27 21:11:13 compute-1 sudo[186413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:13 compute-1 python3.9[186415]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 27 21:11:13 compute-1 sudo[186413]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:14 compute-1 sudo[186577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwgbuabckupgqotkrldgvnizhrhwhsei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548273.9318743-280-59292359264381/AnsiballZ_group.py'
Jan 27 21:11:14 compute-1 sudo[186577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:14 compute-1 podman[186540]: 2026-01-27 21:11:14.360703131 +0000 UTC m=+0.173265642 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260126)
Jan 27 21:11:14 compute-1 python3.9[186587]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 27 21:11:14 compute-1 groupadd[186595]: group added to /etc/group: name=ceilometer, GID=42405
Jan 27 21:11:14 compute-1 groupadd[186595]: group added to /etc/gshadow: name=ceilometer
Jan 27 21:11:14 compute-1 groupadd[186595]: new group: name=ceilometer, GID=42405
Jan 27 21:11:14 compute-1 sudo[186577]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:15 compute-1 sudo[186750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbgddwttllgjahozfojxqrdkjocexkpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548274.8179355-296-52174199678039/AnsiballZ_user.py'
Jan 27 21:11:15 compute-1 sudo[186750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:15 compute-1 python3.9[186752]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 27 21:11:15 compute-1 useradd[186754]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 27 21:11:15 compute-1 useradd[186754]: add 'ceilometer' to group 'libvirt'
Jan 27 21:11:15 compute-1 useradd[186754]: add 'ceilometer' to shadow group 'libvirt'
Jan 27 21:11:15 compute-1 sudo[186750]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:17 compute-1 python3.9[186910]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:11:18 compute-1 podman[186958]: 2026-01-27 21:11:18.753688474 +0000 UTC m=+0.061483172 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 27 21:11:19 compute-1 python3.9[187050]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769548277.2843227-348-229951663436095/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:19 compute-1 python3.9[187200]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:11:20 compute-1 python3.9[187321]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769548279.310228-348-179350546379117/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:20 compute-1 python3.9[187471]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:11:21 compute-1 python3.9[187592]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769548280.4723065-348-268168280921993/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:22 compute-1 python3.9[187742]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:11:23 compute-1 python3.9[187894]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:11:23 compute-1 python3.9[188046]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:11:24 compute-1 python3.9[188167]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548283.3811018-466-165198481744154/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:11:25 compute-1 python3.9[188317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:11:25 compute-1 python3.9[188438]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548284.68148-466-210214662900295/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:11:26 compute-1 python3.9[188588]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:11:27 compute-1 python3.9[188709]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548285.959287-525-14451289705256/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:11:28 compute-1 python3.9[188859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:11:29 compute-1 python3.9[188980]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548288.1326966-556-150359189617671/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:31 compute-1 python3.9[189130]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:11:32 compute-1 python3.9[189251]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548289.3521998-586-135828797969667/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:33 compute-1 python3.9[189401]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:11:33 compute-1 python3.9[189522]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548292.5248883-616-29593246106764/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:34 compute-1 sudo[189672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsxilvcqoxczpblulvzmafphdfyzcxnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548293.9157782-646-71152284327802/AnsiballZ_file.py'
Jan 27 21:11:34 compute-1 sudo[189672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:34 compute-1 python3.9[189674]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:34 compute-1 sudo[189672]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:35 compute-1 sudo[189824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnlwsoiuyeqwwdrtljkayilhtmbcqqgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548294.6878417-662-218032615362718/AnsiballZ_file.py'
Jan 27 21:11:35 compute-1 sudo[189824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:35 compute-1 python3.9[189826]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:35 compute-1 sudo[189824]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:35 compute-1 python3.9[189976]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:11:36 compute-1 python3.9[190128]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:11:37 compute-1 python3.9[190280]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:11:38 compute-1 sudo[190432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzbvpduxykpednlljmdvlkczhpmjgsjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548297.8040006-726-6440824796392/AnsiballZ_file.py'
Jan 27 21:11:38 compute-1 sudo[190432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:38 compute-1 python3.9[190434]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:11:38 compute-1 sudo[190432]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:38 compute-1 sudo[190584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgzrmyktjxnkotqhaexhnmuoiarvoiqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548298.5662088-742-55297141501966/AnsiballZ_systemd_service.py'
Jan 27 21:11:38 compute-1 sudo[190584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:39 compute-1 python3.9[190586]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:11:39 compute-1 systemd[1]: Reloading.
Jan 27 21:11:39 compute-1 systemd-rc-local-generator[190612]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:11:39 compute-1 systemd-sysv-generator[190616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:11:39 compute-1 systemd[1]: Listening on Podman API Socket.
Jan 27 21:11:39 compute-1 sudo[190584]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:40 compute-1 sudo[190775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwgwcobzhgczukqgdrmzmbidzbufhehm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548299.9925914-760-249615665766746/AnsiballZ_stat.py'
Jan 27 21:11:40 compute-1 sudo[190775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:41 compute-1 python3.9[190777]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:11:41 compute-1 sudo[190775]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:41 compute-1 sudo[190898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohkkwpumqwzpkjjovpzmbsqrwtuhdltn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548299.9925914-760-249615665766746/AnsiballZ_copy.py'
Jan 27 21:11:41 compute-1 sudo[190898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:41 compute-1 python3.9[190900]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548299.9925914-760-249615665766746/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:11:41 compute-1 sudo[190898]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:43 compute-1 sudo[191051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aimmnescunadaochucfufiufkkeodlae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548303.0161572-802-98445628277040/AnsiballZ_file.py'
Jan 27 21:11:43 compute-1 sudo[191051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:43 compute-1 python3.9[191053]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:43 compute-1 sudo[191051]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:44 compute-1 sudo[191203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dydraaiutqqifkhmmoegbxlwmjiklbui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548303.8169441-818-69709992341238/AnsiballZ_file.py'
Jan 27 21:11:44 compute-1 sudo[191203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:44 compute-1 python3.9[191205]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:11:44 compute-1 sudo[191203]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:44 compute-1 podman[191243]: 2026-01-27 21:11:44.831781757 +0000 UTC m=+0.126585614 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 27 21:11:45 compute-1 python3.9[191381]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.151 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.151 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.151 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.151 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.152 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.152 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.669 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.864 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.866 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.891 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.892 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6136MB free_disk=73.3498420715332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.892 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:11:46 compute-1 nova_compute[183751]: 2026-01-27 21:11:46.893 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:11:47 compute-1 sudo[191803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znalsfvydnuocpuqjvehdxhospjgtiiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548307.3201082-886-5891280827217/AnsiballZ_container_config_data.py'
Jan 27 21:11:47 compute-1 sudo[191803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:47 compute-1 python3.9[191805]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 27 21:11:48 compute-1 sudo[191803]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:48 compute-1 nova_compute[183751]: 2026-01-27 21:11:48.038 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:11:48 compute-1 nova_compute[183751]: 2026-01-27 21:11:48.039 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:11:46 up  1:14,  0 user,  load average: 0.78, 0.72, 0.52\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:11:48 compute-1 nova_compute[183751]: 2026-01-27 21:11:48.063 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:11:48 compute-1 nova_compute[183751]: 2026-01-27 21:11:48.574 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:11:48 compute-1 sudo[191967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmtvumghatxtyyixrjzrapyrpvvegyrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548308.4200666-908-172342200548978/AnsiballZ_container_config_hash.py'
Jan 27 21:11:48 compute-1 sudo[191967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:48 compute-1 podman[191929]: 2026-01-27 21:11:48.912857118 +0000 UTC m=+0.098233973 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Jan 27 21:11:49 compute-1 python3.9[191975]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 27 21:11:49 compute-1 nova_compute[183751]: 2026-01-27 21:11:49.086 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:11:49 compute-1 nova_compute[183751]: 2026-01-27 21:11:49.086 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.193s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:11:49 compute-1 sudo[191967]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:50 compute-1 sudo[192125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrlftfphevlqrkikbuqmgjmmygwtdzld ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769548309.516881-928-135779720737300/AnsiballZ_edpm_container_manage.py'
Jan 27 21:11:50 compute-1 sudo[192125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:50 compute-1 python3[192127]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 27 21:11:51 compute-1 podman[192141]: 2026-01-27 21:11:51.801333413 +0000 UTC m=+1.147011435 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 27 21:11:51 compute-1 podman[192240]: 2026-01-27 21:11:51.950276592 +0000 UTC m=+0.050432052 container create 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:11:51 compute-1 podman[192240]: 2026-01-27 21:11:51.925550062 +0000 UTC m=+0.025705542 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 27 21:11:51 compute-1 python3[192127]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 27 21:11:52 compute-1 sudo[192125]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:52 compute-1 sudo[192427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apocwzwktawzsofjtzhamrqnwcdkzegl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548312.3555984-944-180228943407953/AnsiballZ_stat.py'
Jan 27 21:11:52 compute-1 sudo[192427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:52 compute-1 python3.9[192429]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:11:52 compute-1 sudo[192427]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:54 compute-1 sudo[192581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erjnyfyplbhlgfwrreyebzxpqfwporda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548313.2364674-962-33271961393810/AnsiballZ_file.py'
Jan 27 21:11:54 compute-1 sudo[192581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:54 compute-1 python3.9[192583]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:54 compute-1 sudo[192581]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:54 compute-1 sudo[192657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jppitqicpfrzecuutscwuttdvubczpib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548313.2364674-962-33271961393810/AnsiballZ_stat.py'
Jan 27 21:11:54 compute-1 sudo[192657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:55 compute-1 python3.9[192659]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:11:55 compute-1 sudo[192657]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:55 compute-1 sudo[192808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aizwkwsmeriumvieghtwdztcuezauybe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548315.0778105-962-21882166090337/AnsiballZ_copy.py'
Jan 27 21:11:55 compute-1 sudo[192808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:55 compute-1 python3.9[192810]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769548315.0778105-962-21882166090337/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:11:55 compute-1 sudo[192808]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:56 compute-1 sudo[192884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjumtwaunzoqijpihirngdmkccovrypd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548315.0778105-962-21882166090337/AnsiballZ_systemd.py'
Jan 27 21:11:56 compute-1 sudo[192884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:56 compute-1 python3.9[192886]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:11:56 compute-1 systemd[1]: Reloading.
Jan 27 21:11:56 compute-1 systemd-rc-local-generator[192906]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:11:56 compute-1 systemd-sysv-generator[192909]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:11:57 compute-1 sudo[192884]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:57 compute-1 sudo[192995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cojxrihqpqbouolnpwggfmtcoykuosqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548315.0778105-962-21882166090337/AnsiballZ_systemd.py'
Jan 27 21:11:57 compute-1 sudo[192995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:11:57 compute-1 python3.9[192997]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:11:57 compute-1 systemd[1]: Reloading.
Jan 27 21:11:57 compute-1 systemd-sysv-generator[193032]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:11:57 compute-1 systemd-rc-local-generator[193028]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:11:58 compute-1 systemd[1]: Starting podman_exporter container...
Jan 27 21:11:58 compute-1 systemd[1]: Started libcrun container.
Jan 27 21:11:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dba997ce8997c8c98d78dba1266f379b4d576ec936a5e073d7a807e42bdf2760/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 27 21:11:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dba997ce8997c8c98d78dba1266f379b4d576ec936a5e073d7a807e42bdf2760/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 27 21:11:58 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083.
Jan 27 21:11:58 compute-1 podman[193038]: 2026-01-27 21:11:58.227095946 +0000 UTC m=+0.138352510 container init 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:11:58 compute-1 podman_exporter[193053]: ts=2026-01-27T21:11:58.254Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 27 21:11:58 compute-1 podman_exporter[193053]: ts=2026-01-27T21:11:58.254Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 27 21:11:58 compute-1 podman_exporter[193053]: ts=2026-01-27T21:11:58.254Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 27 21:11:58 compute-1 podman_exporter[193053]: ts=2026-01-27T21:11:58.254Z caller=handler.go:105 level=info collector=container
Jan 27 21:11:58 compute-1 podman[193038]: 2026-01-27 21:11:58.258159218 +0000 UTC m=+0.169415712 container start 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:11:58 compute-1 systemd[1]: Starting Podman API Service...
Jan 27 21:11:58 compute-1 systemd[1]: Started Podman API Service.
Jan 27 21:11:58 compute-1 podman[193064]: time="2026-01-27T21:11:58Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 27 21:11:58 compute-1 podman[193064]: time="2026-01-27T21:11:58Z" level=info msg="Setting parallel job count to 25"
Jan 27 21:11:58 compute-1 podman[193064]: time="2026-01-27T21:11:58Z" level=info msg="Using sqlite as database backend"
Jan 27 21:11:58 compute-1 podman[193038]: podman_exporter
Jan 27 21:11:58 compute-1 podman[193064]: time="2026-01-27T21:11:58Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 27 21:11:58 compute-1 podman[193064]: time="2026-01-27T21:11:58Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 27 21:11:58 compute-1 podman[193064]: time="2026-01-27T21:11:58Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 27 21:11:58 compute-1 systemd[1]: Started podman_exporter container.
Jan 27 21:11:58 compute-1 podman[193064]: @ - - [27/Jan/2026:21:11:58 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 27 21:11:58 compute-1 podman[193064]: time="2026-01-27T21:11:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:11:58 compute-1 sudo[192995]: pam_unix(sudo:session): session closed for user root
Jan 27 21:11:58 compute-1 podman[193064]: @ - - [27/Jan/2026:21:11:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 12118 "" "Go-http-client/1.1"
Jan 27 21:11:58 compute-1 podman_exporter[193053]: ts=2026-01-27T21:11:58.345Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 27 21:11:58 compute-1 podman_exporter[193053]: ts=2026-01-27T21:11:58.346Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 27 21:11:58 compute-1 podman_exporter[193053]: ts=2026-01-27T21:11:58.347Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 27 21:11:58 compute-1 podman[193062]: 2026-01-27 21:11:58.351286404 +0000 UTC m=+0.071444537 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:11:58 compute-1 systemd[1]: 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083-701d4f77c98bd0bb.service: Main process exited, code=exited, status=1/FAILURE
Jan 27 21:11:58 compute-1 systemd[1]: 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083-701d4f77c98bd0bb.service: Failed with result 'exit-code'.
Jan 27 21:11:59 compute-1 python3.9[193245]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 27 21:12:00 compute-1 sudo[193395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxwbinvnsdoijtxjgnivsdntnwgqdqpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548320.4845772-1052-243046270116774/AnsiballZ_stat.py'
Jan 27 21:12:00 compute-1 sudo[193395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:00 compute-1 python3.9[193397]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:12:01 compute-1 sudo[193395]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:01 compute-1 sudo[193520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrcczckqotqkthiyvyxdhniyilzrxzik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548320.4845772-1052-243046270116774/AnsiballZ_copy.py'
Jan 27 21:12:01 compute-1 sudo[193520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:01 compute-1 python3.9[193522]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548320.4845772-1052-243046270116774/.source.yaml _original_basename=.wpiuihn7 follow=False checksum=fe5062e80c43dae5bdc918c4b4b1917f3d4c407e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:01 compute-1 sudo[193520]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:02 compute-1 sudo[193672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjanqnvntkoidcognozznhkxpbgschwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548321.9371827-1082-214257563222866/AnsiballZ_stat.py'
Jan 27 21:12:02 compute-1 sudo[193672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:02 compute-1 python3.9[193674]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:12:02 compute-1 sudo[193672]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:02 compute-1 sudo[193795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thuxwjfwaselxyshunyjkhbdnxeeiqxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548321.9371827-1082-214257563222866/AnsiballZ_copy.py'
Jan 27 21:12:02 compute-1 sudo[193795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:03 compute-1 python3.9[193797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769548321.9371827-1082-214257563222866/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:12:03 compute-1 sudo[193795]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:04 compute-1 sudo[193947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysojiuxxsekwyvkunnxuzywgyqzskdtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548323.775126-1124-250204576959371/AnsiballZ_file.py'
Jan 27 21:12:04 compute-1 sudo[193947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:04 compute-1 python3.9[193949]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:04 compute-1 sudo[193947]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:04 compute-1 sudo[194099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtsuqmyugcqcommrhuwuydcuytpfmspr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548324.5982397-1140-66432511474132/AnsiballZ_file.py'
Jan 27 21:12:04 compute-1 sudo[194099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:05 compute-1 python3.9[194101]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 27 21:12:05 compute-1 sudo[194099]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:05 compute-1 python3.9[194251]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:08 compute-1 sudo[194672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khvqlxzhjfvisjwdpbahzvdaoytzhedn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548328.3584628-1208-181744355222179/AnsiballZ_container_config_data.py'
Jan 27 21:12:08 compute-1 sudo[194672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:08 compute-1 python3.9[194674]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 27 21:12:08 compute-1 sudo[194672]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:10 compute-1 sudo[194824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsmglkbbinlxqyswuffedijhjrygdshe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548329.8775728-1230-53880444737895/AnsiballZ_container_config_hash.py'
Jan 27 21:12:10 compute-1 sudo[194824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:10 compute-1 python3.9[194826]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 27 21:12:10 compute-1 sudo[194824]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:11 compute-1 sudo[194976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyieckrgxtvzmqibslqkfzvzcugxlnzc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769548330.7784073-1250-276118920512749/AnsiballZ_edpm_container_manage.py'
Jan 27 21:12:11 compute-1 sudo[194976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:12:11.152 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:12:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:12:11.153 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:12:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:12:11.153 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:12:11 compute-1 python3[194978]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 27 21:12:13 compute-1 podman[194990]: 2026-01-27 21:12:13.69307921 +0000 UTC m=+2.087642621 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 27 21:12:13 compute-1 podman[195088]: 2026-01-27 21:12:13.817550805 +0000 UTC m=+0.046482952 container create 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Jan 27 21:12:13 compute-1 podman[195088]: 2026-01-27 21:12:13.792677043 +0000 UTC m=+0.021609210 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 27 21:12:13 compute-1 python3[194978]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 27 21:12:13 compute-1 sudo[194976]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:14 compute-1 sudo[195277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpdslknlgdhxnubqoqzfebiazmaxkrvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548334.401216-1266-274146201677876/AnsiballZ_stat.py'
Jan 27 21:12:14 compute-1 sudo[195277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:14 compute-1 python3.9[195279]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:12:14 compute-1 sudo[195277]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:15 compute-1 sudo[195448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khjseoctsrnjnkvyvtrbfofccqzpwmzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548335.2330234-1284-119318467635900/AnsiballZ_file.py'
Jan 27 21:12:15 compute-1 sudo[195448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:15 compute-1 podman[195405]: 2026-01-27 21:12:15.664260797 +0000 UTC m=+0.124470277 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Jan 27 21:12:15 compute-1 python3.9[195456]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:15 compute-1 sudo[195448]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:16 compute-1 sudo[195534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyssnqowtsyulyseibonxxqhjvdicnig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548335.2330234-1284-119318467635900/AnsiballZ_stat.py'
Jan 27 21:12:16 compute-1 sudo[195534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:16 compute-1 python3.9[195536]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:12:16 compute-1 sudo[195534]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:16 compute-1 sudo[195685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdvysvckqfntlyiqqmwcwnktrsyefmiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548336.351796-1284-37133060520059/AnsiballZ_copy.py'
Jan 27 21:12:16 compute-1 sudo[195685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:17 compute-1 python3.9[195687]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769548336.351796-1284-37133060520059/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:17 compute-1 sudo[195685]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:17 compute-1 sudo[195761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzeibbmcabvbdcysiugilaijwbtslfgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548336.351796-1284-37133060520059/AnsiballZ_systemd.py'
Jan 27 21:12:17 compute-1 sudo[195761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:17 compute-1 python3.9[195763]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 27 21:12:17 compute-1 systemd[1]: Reloading.
Jan 27 21:12:17 compute-1 systemd-rc-local-generator[195787]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:12:17 compute-1 systemd-sysv-generator[195791]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:12:18 compute-1 sudo[195761]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:18 compute-1 sudo[195871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewaofiyzpwduppgncedotqyikkxyudsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548336.351796-1284-37133060520059/AnsiballZ_systemd.py'
Jan 27 21:12:18 compute-1 sudo[195871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:18 compute-1 python3.9[195873]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 27 21:12:18 compute-1 systemd[1]: Reloading.
Jan 27 21:12:18 compute-1 systemd-sysv-generator[195908]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 27 21:12:18 compute-1 systemd-rc-local-generator[195903]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 27 21:12:19 compute-1 systemd[1]: Starting openstack_network_exporter container...
Jan 27 21:12:19 compute-1 systemd[1]: Started libcrun container.
Jan 27 21:12:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58b6fd7ff5282da4fcaf5a378d1606d7abba54a70c4c881a43134cebbe3c58af/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 27 21:12:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58b6fd7ff5282da4fcaf5a378d1606d7abba54a70c4c881a43134cebbe3c58af/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 27 21:12:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58b6fd7ff5282da4fcaf5a378d1606d7abba54a70c4c881a43134cebbe3c58af/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 27 21:12:19 compute-1 podman[195912]: 2026-01-27 21:12:19.150031082 +0000 UTC m=+0.102686844 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest)
Jan 27 21:12:19 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d.
Jan 27 21:12:19 compute-1 podman[195914]: 2026-01-27 21:12:19.185225108 +0000 UTC m=+0.140434298 container init 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:48: registering *bridge.Collector
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:48: registering *coverage.Collector
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:48: registering *datapath.Collector
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:48: registering *iface.Collector
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:48: registering *memory.Collector
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:48: registering *ovn.Collector
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:48: registering *pmd_perf.Collector
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:48: registering *pmd_rxq.Collector
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: INFO    21:12:19 main.go:48: registering *vswitch.Collector
Jan 27 21:12:19 compute-1 openstack_network_exporter[195945]: NOTICE  21:12:19 main.go:76: listening on https://:9105/metrics
Jan 27 21:12:19 compute-1 podman[195914]: 2026-01-27 21:12:19.210028549 +0000 UTC m=+0.165237639 container start 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 21:12:19 compute-1 podman[195914]: openstack_network_exporter
Jan 27 21:12:19 compute-1 systemd[1]: Started openstack_network_exporter container.
Jan 27 21:12:19 compute-1 sudo[195871]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:19 compute-1 podman[195958]: 2026-01-27 21:12:19.316837185 +0000 UTC m=+0.094438587 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, config_id=openstack_network_exporter)
Jan 27 21:12:20 compute-1 python3.9[196131]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 27 21:12:21 compute-1 sudo[196281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liwlbzrdmktpxcdzycwxmfyhlzccvndf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548341.1591854-1374-252965200914010/AnsiballZ_stat.py'
Jan 27 21:12:21 compute-1 sudo[196281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:21 compute-1 python3.9[196283]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:12:21 compute-1 sudo[196281]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:22 compute-1 sudo[196406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mktvyzyndykhapwtlpywudqkxtswzlbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548341.1591854-1374-252965200914010/AnsiballZ_copy.py'
Jan 27 21:12:22 compute-1 sudo[196406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:22 compute-1 python3.9[196408]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548341.1591854-1374-252965200914010/.source.yaml _original_basename=.gqfwp7ct follow=False checksum=c7753f297e665a5c2f5b50d212db468b16f7523c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:22 compute-1 sudo[196406]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:22 compute-1 auditd[702]: Audit daemon rotating log files
Jan 27 21:12:23 compute-1 sudo[196558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qekooqpsgcfldeyluxbwqyryuoacobhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548342.8562229-1404-137683256321455/AnsiballZ_find.py'
Jan 27 21:12:23 compute-1 sudo[196558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:23 compute-1 python3.9[196560]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 27 21:12:23 compute-1 sudo[196558]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:28 compute-1 podman[196585]: 2026-01-27 21:12:28.753775667 +0000 UTC m=+0.060727379 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:12:40 compute-1 sudo[196734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgcettftqauhtennvaxrobwpysugstux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548359.968307-1554-259150166032723/AnsiballZ_podman_container_info.py'
Jan 27 21:12:40 compute-1 sudo[196734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:40 compute-1 python3.9[196736]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 27 21:12:40 compute-1 sudo[196734]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:41 compute-1 sudo[196899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrkqdyspoacrwoaztfburddfcgguiipw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548360.9040554-1562-236431906989557/AnsiballZ_podman_container_exec.py'
Jan 27 21:12:41 compute-1 sudo[196899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:41 compute-1 python3.9[196901]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 27 21:12:41 compute-1 systemd[1]: Started libpod-conmon-0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08.scope.
Jan 27 21:12:41 compute-1 podman[196902]: 2026-01-27 21:12:41.595163691 +0000 UTC m=+0.098435234 container exec 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 21:12:41 compute-1 podman[196902]: 2026-01-27 21:12:41.631399111 +0000 UTC m=+0.134670674 container exec_died 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 21:12:41 compute-1 systemd[1]: libpod-conmon-0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08.scope: Deactivated successfully.
Jan 27 21:12:41 compute-1 sudo[196899]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:42 compute-1 sudo[197084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfvfkzqsazsboljplfexstkusuafpfca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548361.9244866-1570-165401194050475/AnsiballZ_podman_container_exec.py'
Jan 27 21:12:42 compute-1 sudo[197084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:42 compute-1 python3.9[197086]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 27 21:12:42 compute-1 systemd[1]: Started libpod-conmon-0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08.scope.
Jan 27 21:12:42 compute-1 podman[197087]: 2026-01-27 21:12:42.896378383 +0000 UTC m=+0.109738375 container exec 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:12:42 compute-1 podman[197087]: 2026-01-27 21:12:42.928641687 +0000 UTC m=+0.142001639 container exec_died 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:12:42 compute-1 systemd[1]: libpod-conmon-0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08.scope: Deactivated successfully.
Jan 27 21:12:42 compute-1 sudo[197084]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:43 compute-1 sudo[197270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpbmnrwsqgwouxxorwhzcgufzodsurtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548363.128845-1578-25905084345019/AnsiballZ_file.py'
Jan 27 21:12:43 compute-1 sudo[197270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:43 compute-1 python3.9[197272]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:43 compute-1 sudo[197270]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:44 compute-1 sudo[197422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkhstbgftiekkreulhxhweoeqdrrnjch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548363.8493054-1587-192052384862341/AnsiballZ_podman_container_info.py'
Jan 27 21:12:44 compute-1 sudo[197422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:44 compute-1 python3.9[197424]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 27 21:12:44 compute-1 sudo[197422]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:45 compute-1 sudo[197587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwhddcikhgwxcylirjunzpnllpxhiiia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548364.8113968-1595-216214491287005/AnsiballZ_podman_container_exec.py'
Jan 27 21:12:45 compute-1 sudo[197587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:45 compute-1 python3.9[197589]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 27 21:12:45 compute-1 systemd[1]: Started libpod-conmon-6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d.scope.
Jan 27 21:12:45 compute-1 podman[197590]: 2026-01-27 21:12:45.49737069 +0000 UTC m=+0.083888684 container exec 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Jan 27 21:12:45 compute-1 podman[197590]: 2026-01-27 21:12:45.531230383 +0000 UTC m=+0.117748357 container exec_died 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:12:45 compute-1 systemd[1]: libpod-conmon-6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d.scope: Deactivated successfully.
Jan 27 21:12:45 compute-1 sudo[197587]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:46 compute-1 sudo[197783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpdyvalbgexfydscjheebmiyqohayias ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548365.8033838-1603-209642704893679/AnsiballZ_podman_container_exec.py'
Jan 27 21:12:46 compute-1 sudo[197783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:46 compute-1 podman[197746]: 2026-01-27 21:12:46.232587947 +0000 UTC m=+0.120318429 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:12:46 compute-1 python3.9[197789]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 27 21:12:46 compute-1 systemd[1]: Started libpod-conmon-6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d.scope.
Jan 27 21:12:46 compute-1 podman[197799]: 2026-01-27 21:12:46.511045901 +0000 UTC m=+0.092655475 container exec 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:12:46 compute-1 podman[197799]: 2026-01-27 21:12:46.545503348 +0000 UTC m=+0.127112932 container exec_died 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 27 21:12:46 compute-1 sudo[197783]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:46 compute-1 systemd[1]: libpod-conmon-6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d.scope: Deactivated successfully.
Jan 27 21:12:47 compute-1 sudo[197978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swkjjnfygjaaomfpnibdzjdtvtostytj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548366.743665-1611-223655548564062/AnsiballZ_file.py'
Jan 27 21:12:47 compute-1 sudo[197978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:47 compute-1 python3.9[197980]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:47 compute-1 sudo[197978]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:48 compute-1 sudo[198130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lemfiigbzusakxpidhhbwicehyraynah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548367.7186377-1620-165082303207377/AnsiballZ_podman_container_info.py'
Jan 27 21:12:48 compute-1 sudo[198130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:48 compute-1 python3.9[198132]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 27 21:12:48 compute-1 sudo[198130]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:48 compute-1 sudo[198295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpkzncymefdolnpkhunvophorbbdzqvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548368.5815427-1628-194598190592904/AnsiballZ_podman_container_exec.py'
Jan 27 21:12:48 compute-1 sudo[198295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:49 compute-1 nova_compute[183751]: 2026-01-27 21:12:49.079 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:12:49 compute-1 nova_compute[183751]: 2026-01-27 21:12:49.081 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:12:49 compute-1 python3.9[198297]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 27 21:12:49 compute-1 systemd[1]: Started libpod-conmon-33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083.scope.
Jan 27 21:12:49 compute-1 podman[198298]: 2026-01-27 21:12:49.255485741 +0000 UTC m=+0.072253925 container exec 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:12:49 compute-1 podman[198298]: 2026-01-27 21:12:49.290344998 +0000 UTC m=+0.107113182 container exec_died 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:12:49 compute-1 systemd[1]: libpod-conmon-33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083.scope: Deactivated successfully.
Jan 27 21:12:49 compute-1 sudo[198295]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:49 compute-1 podman[198316]: 2026-01-27 21:12:49.350172754 +0000 UTC m=+0.080197116 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 27 21:12:49 compute-1 podman[198347]: 2026-01-27 21:12:49.431025074 +0000 UTC m=+0.052946191 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 27 21:12:49 compute-1 nova_compute[183751]: 2026-01-27 21:12:49.592 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:12:49 compute-1 nova_compute[183751]: 2026-01-27 21:12:49.593 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:12:49 compute-1 nova_compute[183751]: 2026-01-27 21:12:49.593 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:12:49 compute-1 nova_compute[183751]: 2026-01-27 21:12:49.594 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:12:49 compute-1 nova_compute[183751]: 2026-01-27 21:12:49.594 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:12:49 compute-1 nova_compute[183751]: 2026-01-27 21:12:49.594 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:12:49 compute-1 nova_compute[183751]: 2026-01-27 21:12:49.595 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:12:49 compute-1 nova_compute[183751]: 2026-01-27 21:12:49.595 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:12:49 compute-1 sudo[198517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieeapltdqslddvxfsqkqbsjygukkztvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548369.5166874-1636-25886542129728/AnsiballZ_podman_container_exec.py'
Jan 27 21:12:49 compute-1 sudo[198517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:50 compute-1 python3.9[198519]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 27 21:12:50 compute-1 systemd[1]: Started libpod-conmon-33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083.scope.
Jan 27 21:12:50 compute-1 nova_compute[183751]: 2026-01-27 21:12:50.109 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:12:50 compute-1 nova_compute[183751]: 2026-01-27 21:12:50.109 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:12:50 compute-1 nova_compute[183751]: 2026-01-27 21:12:50.110 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:12:50 compute-1 nova_compute[183751]: 2026-01-27 21:12:50.110 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:12:50 compute-1 podman[198520]: 2026-01-27 21:12:50.120866742 +0000 UTC m=+0.084628212 container exec 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:12:50 compute-1 podman[198520]: 2026-01-27 21:12:50.156275472 +0000 UTC m=+0.120036892 container exec_died 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:12:50 compute-1 systemd[1]: libpod-conmon-33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083.scope: Deactivated successfully.
Jan 27 21:12:50 compute-1 sudo[198517]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:50 compute-1 nova_compute[183751]: 2026-01-27 21:12:50.302 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:12:50 compute-1 nova_compute[183751]: 2026-01-27 21:12:50.304 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:12:50 compute-1 nova_compute[183751]: 2026-01-27 21:12:50.325 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:12:50 compute-1 nova_compute[183751]: 2026-01-27 21:12:50.326 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5937MB free_disk=73.18145370483398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:12:50 compute-1 nova_compute[183751]: 2026-01-27 21:12:50.326 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:12:50 compute-1 nova_compute[183751]: 2026-01-27 21:12:50.327 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:12:50 compute-1 sudo[198703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bygeghcgihysbhuxidyotipwtenyqgyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548370.4049635-1644-55549704914451/AnsiballZ_file.py'
Jan 27 21:12:50 compute-1 sudo[198703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:51 compute-1 python3.9[198705]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:51 compute-1 sudo[198703]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:51 compute-1 nova_compute[183751]: 2026-01-27 21:12:51.382 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:12:51 compute-1 nova_compute[183751]: 2026-01-27 21:12:51.382 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:12:50 up  1:15,  0 user,  load average: 0.52, 0.65, 0.51\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:12:51 compute-1 nova_compute[183751]: 2026-01-27 21:12:51.404 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:12:51 compute-1 nova_compute[183751]: 2026-01-27 21:12:51.913 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:12:52 compute-1 sudo[198855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akwedifuzikwutzfsrlxoqmkcqfompbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548371.7369382-1653-122976653772728/AnsiballZ_podman_container_info.py'
Jan 27 21:12:52 compute-1 sudo[198855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:52 compute-1 python3.9[198857]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 27 21:12:52 compute-1 sudo[198855]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:52 compute-1 nova_compute[183751]: 2026-01-27 21:12:52.421 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:12:52 compute-1 nova_compute[183751]: 2026-01-27 21:12:52.422 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:12:52 compute-1 sudo[199021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqllpedgaantinpewunqcccsuzoanbzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548372.4582932-1661-256825479345138/AnsiballZ_podman_container_exec.py'
Jan 27 21:12:52 compute-1 sudo[199021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:52 compute-1 python3.9[199023]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 27 21:12:53 compute-1 systemd[1]: Started libpod-conmon-0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d.scope.
Jan 27 21:12:53 compute-1 podman[199024]: 2026-01-27 21:12:53.064118595 +0000 UTC m=+0.070025102 container exec 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:12:53 compute-1 podman[199024]: 2026-01-27 21:12:53.069826572 +0000 UTC m=+0.075733069 container exec_died 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 21:12:53 compute-1 sudo[199021]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:53 compute-1 systemd[1]: libpod-conmon-0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d.scope: Deactivated successfully.
Jan 27 21:12:54 compute-1 sudo[199206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnqkufpekzjbfkffberceyyibjmvgsxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548374.0954657-1669-44337239001764/AnsiballZ_podman_container_exec.py'
Jan 27 21:12:54 compute-1 sudo[199206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:54 compute-1 python3.9[199208]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 27 21:12:54 compute-1 systemd[1]: Started libpod-conmon-0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d.scope.
Jan 27 21:12:54 compute-1 podman[199209]: 2026-01-27 21:12:54.731087226 +0000 UTC m=+0.090239367 container exec 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, release=1755695350, distribution-scope=public, name=ubi9-minimal)
Jan 27 21:12:54 compute-1 podman[199209]: 2026-01-27 21:12:54.760785278 +0000 UTC m=+0.119937389 container exec_died 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Jan 27 21:12:54 compute-1 systemd[1]: libpod-conmon-0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d.scope: Deactivated successfully.
Jan 27 21:12:54 compute-1 sudo[199206]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:55 compute-1 sudo[199391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxdercrksooyowcqdvnwzjnezavscnym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548375.0081708-1677-248547849096727/AnsiballZ_file.py'
Jan 27 21:12:55 compute-1 sudo[199391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:55 compute-1 python3.9[199393]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:55 compute-1 sudo[199391]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:56 compute-1 sudo[199543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmkgszwxanewuvjxcgrdjnclxreyrrez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548375.8734431-1688-166280038952616/AnsiballZ_file.py'
Jan 27 21:12:56 compute-1 sudo[199543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:56 compute-1 python3.9[199545]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:56 compute-1 sudo[199543]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:57 compute-1 sudo[199695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsqpgzlqkegtaduvpdlrmihfjcfiybit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548376.7331553-1704-41420444493320/AnsiballZ_stat.py'
Jan 27 21:12:57 compute-1 sudo[199695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:57 compute-1 python3.9[199697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:12:57 compute-1 sudo[199695]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:57 compute-1 sudo[199818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qasfiplvvvepiiotfsujtayfhyxgrext ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548376.7331553-1704-41420444493320/AnsiballZ_copy.py'
Jan 27 21:12:57 compute-1 sudo[199818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:57 compute-1 python3.9[199820]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769548376.7331553-1704-41420444493320/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:57 compute-1 sudo[199818]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:58 compute-1 sudo[199970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnuviosmepxhmyaxqyrytxtpazxmmtsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548378.1886902-1736-206995650825932/AnsiballZ_file.py'
Jan 27 21:12:58 compute-1 sudo[199970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:58 compute-1 python3.9[199972]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:12:58 compute-1 sudo[199970]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:59 compute-1 sudo[200134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncpkqhrythrcakcvhaxpckiqsugzykmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548378.966519-1752-256784950333408/AnsiballZ_stat.py'
Jan 27 21:12:59 compute-1 sudo[200134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:12:59 compute-1 podman[200096]: 2026-01-27 21:12:59.364991756 +0000 UTC m=+0.076499727 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:12:59 compute-1 python3.9[200140]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:12:59 compute-1 sudo[200134]: pam_unix(sudo:session): session closed for user root
Jan 27 21:12:59 compute-1 sudo[200225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpltlyhvutefsfufppmduonixejuccib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548378.966519-1752-256784950333408/AnsiballZ_file.py'
Jan 27 21:12:59 compute-1 sudo[200225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:00 compute-1 python3.9[200227]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:00 compute-1 sudo[200225]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:00 compute-1 sudo[200377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itqmhvxmkghueazdhogvxuiamtzkpysu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548380.372723-1776-209287155892600/AnsiballZ_stat.py'
Jan 27 21:13:00 compute-1 sudo[200377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:00 compute-1 python3.9[200379]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:13:00 compute-1 sudo[200377]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:01 compute-1 sudo[200455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upfrvdlffdxvelglvxnejvsgmphvdshk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548380.372723-1776-209287155892600/AnsiballZ_file.py'
Jan 27 21:13:01 compute-1 sudo[200455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:01 compute-1 python3.9[200457]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.108p6ds9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:01 compute-1 sudo[200455]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:02 compute-1 sudo[200607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edfusxtixdsusfjsgjidxpowizuqweci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548381.9500937-1800-261178104962292/AnsiballZ_stat.py'
Jan 27 21:13:02 compute-1 sudo[200607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:02 compute-1 python3.9[200609]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:13:02 compute-1 sudo[200607]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:02 compute-1 sudo[200685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjsbvwevcleojhklvqkniwunmsvgsghx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548381.9500937-1800-261178104962292/AnsiballZ_file.py'
Jan 27 21:13:02 compute-1 sudo[200685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:03 compute-1 python3.9[200687]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:03 compute-1 sudo[200685]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:04 compute-1 sudo[200837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwkppuudxvircvykoiizfqqzzbwezhmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548384.2252917-1826-89032000625452/AnsiballZ_command.py'
Jan 27 21:13:04 compute-1 sudo[200837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:04 compute-1 python3.9[200839]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:13:04 compute-1 sudo[200837]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:05 compute-1 sudo[200990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgbswxrynorvedfvzstrsmdttcyiqdnl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769548384.9257803-1842-205036477792716/AnsiballZ_edpm_nftables_from_files.py'
Jan 27 21:13:05 compute-1 sudo[200990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:05 compute-1 python3[200992]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 27 21:13:05 compute-1 sudo[200990]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:07 compute-1 sudo[201142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuotuoxczzqmsytwhdzugvobkthkfedk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548386.881146-1858-88694267355894/AnsiballZ_stat.py'
Jan 27 21:13:07 compute-1 sudo[201142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:07 compute-1 python3.9[201144]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:13:07 compute-1 sudo[201142]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:07 compute-1 sudo[201220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcsurhlrngebasobohsubvmqoiiovvpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548386.881146-1858-88694267355894/AnsiballZ_file.py'
Jan 27 21:13:07 compute-1 sudo[201220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:07 compute-1 python3.9[201222]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:07 compute-1 sudo[201220]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:08 compute-1 sudo[201372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncothpdrixqsrzynrchbmpxldyskjkzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548388.1895368-1882-68489698968450/AnsiballZ_stat.py'
Jan 27 21:13:08 compute-1 sudo[201372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:08 compute-1 python3.9[201374]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:13:08 compute-1 sudo[201372]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:09 compute-1 sudo[201450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xryjxhrksrtmgnshfwqvlwazwqtrugsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548388.1895368-1882-68489698968450/AnsiballZ_file.py'
Jan 27 21:13:09 compute-1 sudo[201450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:09 compute-1 python3.9[201452]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:09 compute-1 sudo[201450]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:10 compute-1 sudo[201602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwhfwmltfmejkghqwsvchkciyhdhcbwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548389.6970031-1906-146691634565499/AnsiballZ_stat.py'
Jan 27 21:13:10 compute-1 sudo[201602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:10 compute-1 python3.9[201604]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:13:10 compute-1 sudo[201602]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:10 compute-1 sudo[201680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgqjizdbhldpgdjyfsnmeuytvnuhupgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548389.6970031-1906-146691634565499/AnsiballZ_file.py'
Jan 27 21:13:10 compute-1 sudo[201680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:10 compute-1 python3.9[201682]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:10 compute-1 sudo[201680]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:13:11.156 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:13:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:13:11.157 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:13:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:13:11.157 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:13:11 compute-1 sudo[201833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-albhjiazhnybbjbuxfpkkbeoncpixiik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548391.248389-1930-1820081464555/AnsiballZ_stat.py'
Jan 27 21:13:11 compute-1 sudo[201833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:11 compute-1 python3.9[201835]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:13:11 compute-1 sudo[201833]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:12 compute-1 sudo[201911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfeqvqowujucbnkzzrkqxanvheshrjmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548391.248389-1930-1820081464555/AnsiballZ_file.py'
Jan 27 21:13:12 compute-1 sudo[201911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:12 compute-1 python3.9[201913]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:12 compute-1 sudo[201911]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:12 compute-1 sudo[202063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwvhmlttsgjgwjlrywkhvutifxayddgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548392.5592937-1954-144008182714460/AnsiballZ_stat.py'
Jan 27 21:13:12 compute-1 sudo[202063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:13 compute-1 python3.9[202065]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 27 21:13:13 compute-1 sudo[202063]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:13 compute-1 sudo[202188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfqhosjwsxncckfwxlnlqnniddwcqroe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548392.5592937-1954-144008182714460/AnsiballZ_copy.py'
Jan 27 21:13:13 compute-1 sudo[202188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:13 compute-1 python3.9[202190]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769548392.5592937-1954-144008182714460/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:13 compute-1 sudo[202188]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:14 compute-1 sudo[202340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tognhwdxarczppyqtscyjfysbxlwcqza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548394.1149855-1984-48759299557857/AnsiballZ_file.py'
Jan 27 21:13:14 compute-1 sudo[202340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:14 compute-1 python3.9[202342]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:14 compute-1 sudo[202340]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:15 compute-1 sudo[202492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvvkamfiocjunwrildrtyywymstzlfsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548395.057587-2001-242197411486307/AnsiballZ_command.py'
Jan 27 21:13:15 compute-1 sudo[202492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:15 compute-1 python3.9[202494]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:13:15 compute-1 sudo[202492]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:16 compute-1 podman[202597]: 2026-01-27 21:13:16.806372189 +0000 UTC m=+0.114658973 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 21:13:16 compute-1 sudo[202673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnpksooujjoulbukqpqkteatyuuwahbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548395.8894494-2017-148882945129141/AnsiballZ_blockinfile.py'
Jan 27 21:13:17 compute-1 sudo[202673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:17 compute-1 python3.9[202675]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:17 compute-1 sudo[202673]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:19 compute-1 sudo[202825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhpiusstixottsdapadnbalnzhsktzmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548397.7581575-2034-72067002248800/AnsiballZ_command.py'
Jan 27 21:13:19 compute-1 sudo[202825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:19 compute-1 python3.9[202827]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:13:19 compute-1 sudo[202825]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:19 compute-1 podman[202931]: 2026-01-27 21:13:19.760193395 +0000 UTC m=+0.063706060 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=openstack_network_exporter, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7)
Jan 27 21:13:19 compute-1 podman[202942]: 2026-01-27 21:13:19.760277097 +0000 UTC m=+0.060493413 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 27 21:13:19 compute-1 sudo[203017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pamjazurveghxfmerhqoepfhhdvcetyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548399.4996722-2050-17833570239854/AnsiballZ_stat.py'
Jan 27 21:13:19 compute-1 sudo[203017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:19 compute-1 python3.9[203019]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 27 21:13:20 compute-1 sudo[203017]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:20 compute-1 sudo[203171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhrvcticgwkegwiupqbrtgxywqwvctzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548400.202918-2066-177795843606663/AnsiballZ_command.py'
Jan 27 21:13:20 compute-1 sudo[203171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:20 compute-1 python3.9[203173]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 27 21:13:20 compute-1 sudo[203171]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:21 compute-1 openstack_network_exporter[195945]: ERROR   21:13:21 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:13:21 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:13:21 compute-1 openstack_network_exporter[195945]: ERROR   21:13:21 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:13:21 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:13:21 compute-1 sudo[203330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kympbeqctfsuitywsnqtlaaufpyrjtgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769548401.0335033-2082-43886154510309/AnsiballZ_file.py'
Jan 27 21:13:21 compute-1 sudo[203330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 21:13:21 compute-1 python3.9[203332]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 27 21:13:21 compute-1 sudo[203330]: pam_unix(sudo:session): session closed for user root
Jan 27 21:13:22 compute-1 sshd-session[184073]: Connection closed by 192.168.122.30 port 38072
Jan 27 21:13:22 compute-1 sshd-session[184069]: pam_unix(sshd:session): session closed for user zuul
Jan 27 21:13:22 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Jan 27 21:13:22 compute-1 systemd[1]: session-27.scope: Consumed 1min 25.109s CPU time.
Jan 27 21:13:22 compute-1 systemd-logind[786]: Session 27 logged out. Waiting for processes to exit.
Jan 27 21:13:22 compute-1 systemd-logind[786]: Removed session 27.
Jan 27 21:13:29 compute-1 podman[203357]: 2026-01-27 21:13:29.75418916 +0000 UTC m=+0.063833443 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:13:35 compute-1 podman[193064]: time="2026-01-27T21:13:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:13:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:13:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:13:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:13:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2148 "" "Go-http-client/1.1"
Jan 27 21:13:47 compute-1 podman[203385]: 2026-01-27 21:13:47.78252013 +0000 UTC m=+0.089257358 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 27 21:13:49 compute-1 openstack_network_exporter[195945]: ERROR   21:13:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:13:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:13:49 compute-1 openstack_network_exporter[195945]: ERROR   21:13:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:13:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:13:50 compute-1 podman[203416]: 2026-01-27 21:13:50.763597971 +0000 UTC m=+0.060274938 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 27 21:13:50 compute-1 podman[203415]: 2026-01-27 21:13:50.783662983 +0000 UTC m=+0.090423597 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Jan 27 21:13:52 compute-1 sshd-session[203414]: Invalid user solana from 80.94.92.186 port 40130
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.425 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.426 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.426 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.426 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.426 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.426 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.427 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.427 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.427 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:13:52 compute-1 sshd-session[203414]: Connection closed by invalid user solana 80.94.92.186 port 40130 [preauth]
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.942 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.942 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.943 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:13:52 compute-1 nova_compute[183751]: 2026-01-27 21:13:52.943 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:13:53 compute-1 nova_compute[183751]: 2026-01-27 21:13:53.137 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:13:53 compute-1 nova_compute[183751]: 2026-01-27 21:13:53.138 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:13:53 compute-1 nova_compute[183751]: 2026-01-27 21:13:53.172 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:13:53 compute-1 nova_compute[183751]: 2026-01-27 21:13:53.173 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6031MB free_disk=73.18144226074219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:13:53 compute-1 nova_compute[183751]: 2026-01-27 21:13:53.173 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:13:53 compute-1 nova_compute[183751]: 2026-01-27 21:13:53.173 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:13:54 compute-1 nova_compute[183751]: 2026-01-27 21:13:54.246 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:13:54 compute-1 nova_compute[183751]: 2026-01-27 21:13:54.246 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:13:53 up  1:16,  0 user,  load average: 0.31, 0.58, 0.50\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:13:54 compute-1 nova_compute[183751]: 2026-01-27 21:13:54.274 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:13:54 compute-1 nova_compute[183751]: 2026-01-27 21:13:54.781 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:13:55 compute-1 nova_compute[183751]: 2026-01-27 21:13:55.300 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:13:55 compute-1 nova_compute[183751]: 2026-01-27 21:13:55.301 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.128s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:14:00 compute-1 podman[203455]: 2026-01-27 21:14:00.768059926 +0000 UTC m=+0.072646591 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:14:05 compute-1 podman[193064]: time="2026-01-27T21:14:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:14:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:14:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:14:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:14:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 27 21:14:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:14:11.158 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:14:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:14:11.158 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:14:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:14:11.158 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:14:18 compute-1 podman[203481]: 2026-01-27 21:14:18.793416216 +0000 UTC m=+0.099723764 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 21:14:19 compute-1 openstack_network_exporter[195945]: ERROR   21:14:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:14:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:14:19 compute-1 openstack_network_exporter[195945]: ERROR   21:14:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:14:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:14:21 compute-1 podman[203508]: 2026-01-27 21:14:21.77212292 +0000 UTC m=+0.070823137 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., container_name=openstack_network_exporter)
Jan 27 21:14:21 compute-1 podman[203509]: 2026-01-27 21:14:21.7750106 +0000 UTC m=+0.074742622 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260126)
Jan 27 21:14:31 compute-1 podman[203547]: 2026-01-27 21:14:31.768294061 +0000 UTC m=+0.072180379 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:14:35 compute-1 podman[193064]: time="2026-01-27T21:14:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:14:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:14:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:14:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:14:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2151 "" "Go-http-client/1.1"
Jan 27 21:14:49 compute-1 openstack_network_exporter[195945]: ERROR   21:14:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:14:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:14:49 compute-1 openstack_network_exporter[195945]: ERROR   21:14:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:14:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:14:49 compute-1 podman[203572]: 2026-01-27 21:14:49.819530583 +0000 UTC m=+0.133727675 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 
10 Base Image, org.label-schema.vendor=CentOS)
Jan 27 21:14:51 compute-1 nova_compute[183751]: 2026-01-27 21:14:51.019 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:14:51 compute-1 nova_compute[183751]: 2026-01-27 21:14:51.530 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:14:51 compute-1 nova_compute[183751]: 2026-01-27 21:14:51.530 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:14:51 compute-1 nova_compute[183751]: 2026-01-27 21:14:51.530 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:14:51 compute-1 nova_compute[183751]: 2026-01-27 21:14:51.531 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:14:51 compute-1 nova_compute[183751]: 2026-01-27 21:14:51.531 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:14:51 compute-1 nova_compute[183751]: 2026-01-27 21:14:51.531 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:14:51 compute-1 nova_compute[183751]: 2026-01-27 21:14:51.532 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:14:51 compute-1 nova_compute[183751]: 2026-01-27 21:14:51.532 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:14:52 compute-1 nova_compute[183751]: 2026-01-27 21:14:52.047 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:14:52 compute-1 nova_compute[183751]: 2026-01-27 21:14:52.048 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:14:52 compute-1 nova_compute[183751]: 2026-01-27 21:14:52.048 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:14:52 compute-1 nova_compute[183751]: 2026-01-27 21:14:52.048 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:14:52 compute-1 nova_compute[183751]: 2026-01-27 21:14:52.241 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:14:52 compute-1 nova_compute[183751]: 2026-01-27 21:14:52.242 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:14:52 compute-1 nova_compute[183751]: 2026-01-27 21:14:52.255 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:14:52 compute-1 nova_compute[183751]: 2026-01-27 21:14:52.256 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6059MB free_disk=73.18144226074219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:14:52 compute-1 nova_compute[183751]: 2026-01-27 21:14:52.256 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:14:52 compute-1 nova_compute[183751]: 2026-01-27 21:14:52.256 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:14:52 compute-1 podman[203600]: 2026-01-27 21:14:52.765256994 +0000 UTC m=+0.062718690 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 27 21:14:52 compute-1 podman[203599]: 2026-01-27 21:14:52.784944081 +0000 UTC m=+0.079991018 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Jan 27 21:14:53 compute-1 nova_compute[183751]: 2026-01-27 21:14:53.303 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:14:53 compute-1 nova_compute[183751]: 2026-01-27 21:14:53.303 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:14:52 up  1:17,  0 user,  load average: 0.15, 0.48, 0.47\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:14:53 compute-1 nova_compute[183751]: 2026-01-27 21:14:53.325 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:14:53 compute-1 nova_compute[183751]: 2026-01-27 21:14:53.834 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:14:54 compute-1 nova_compute[183751]: 2026-01-27 21:14:54.343 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:14:54 compute-1 nova_compute[183751]: 2026-01-27 21:14:54.344 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:14:54 compute-1 nova_compute[183751]: 2026-01-27 21:14:54.469 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:02 compute-1 podman[203640]: 2026-01-27 21:15:02.748710172 +0000 UTC m=+0.067448838 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:15:05 compute-1 podman[193064]: time="2026-01-27T21:15:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:15:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:15:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:15:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:15:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2154 "" "Go-http-client/1.1"
Jan 27 21:15:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:15:11.159 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:15:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:15:11.160 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:15:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:15:11.160 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:15:19 compute-1 openstack_network_exporter[195945]: ERROR   21:15:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:15:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:15:19 compute-1 openstack_network_exporter[195945]: ERROR   21:15:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:15:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:15:20 compute-1 podman[203665]: 2026-01-27 21:15:20.820722143 +0000 UTC m=+0.121705908 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, 
tcib_build_tag=watcher_latest, config_id=ovn_controller)
Jan 27 21:15:23 compute-1 podman[203692]: 2026-01-27 21:15:23.749638408 +0000 UTC m=+0.055420140 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 27 21:15:23 compute-1 podman[203691]: 2026-01-27 21:15:23.749840183 +0000 UTC m=+0.060500466 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, architecture=x86_64)
Jan 27 21:15:33 compute-1 podman[203731]: 2026-01-27 21:15:33.786237559 +0000 UTC m=+0.100638638 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:15:35 compute-1 podman[193064]: time="2026-01-27T21:15:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:15:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:15:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:15:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:15:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 27 21:15:46 compute-1 nova_compute[183751]: 2026-01-27 21:15:46.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:46 compute-1 nova_compute[183751]: 2026-01-27 21:15:46.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 21:15:46 compute-1 nova_compute[183751]: 2026-01-27 21:15:46.656 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 21:15:46 compute-1 nova_compute[183751]: 2026-01-27 21:15:46.657 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:46 compute-1 nova_compute[183751]: 2026-01-27 21:15:46.657 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 21:15:47 compute-1 nova_compute[183751]: 2026-01-27 21:15:47.164 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:48 compute-1 nova_compute[183751]: 2026-01-27 21:15:48.674 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:48 compute-1 nova_compute[183751]: 2026-01-27 21:15:48.674 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:49 compute-1 nova_compute[183751]: 2026-01-27 21:15:49.194 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:15:49 compute-1 nova_compute[183751]: 2026-01-27 21:15:49.195 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:15:49 compute-1 nova_compute[183751]: 2026-01-27 21:15:49.195 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:15:49 compute-1 nova_compute[183751]: 2026-01-27 21:15:49.195 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:15:49 compute-1 nova_compute[183751]: 2026-01-27 21:15:49.341 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:15:49 compute-1 nova_compute[183751]: 2026-01-27 21:15:49.341 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:15:49 compute-1 nova_compute[183751]: 2026-01-27 21:15:49.368 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:15:49 compute-1 nova_compute[183751]: 2026-01-27 21:15:49.369 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6097MB free_disk=73.18144226074219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:15:49 compute-1 nova_compute[183751]: 2026-01-27 21:15:49.369 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:15:49 compute-1 nova_compute[183751]: 2026-01-27 21:15:49.369 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:15:49 compute-1 openstack_network_exporter[195945]: ERROR   21:15:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:15:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:15:49 compute-1 openstack_network_exporter[195945]: ERROR   21:15:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:15:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:15:50 compute-1 nova_compute[183751]: 2026-01-27 21:15:50.533 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:15:50 compute-1 nova_compute[183751]: 2026-01-27 21:15:50.533 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:15:49 up  1:18,  0 user,  load average: 0.28, 0.46, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:15:50 compute-1 nova_compute[183751]: 2026-01-27 21:15:50.568 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:15:51 compute-1 nova_compute[183751]: 2026-01-27 21:15:51.075 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:15:51 compute-1 nova_compute[183751]: 2026-01-27 21:15:51.585 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:15:51 compute-1 nova_compute[183751]: 2026-01-27 21:15:51.585 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.216s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:15:51 compute-1 podman[203756]: 2026-01-27 21:15:51.819672516 +0000 UTC m=+0.119907014 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, 
tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 27 21:15:52 compute-1 nova_compute[183751]: 2026-01-27 21:15:52.059 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:52 compute-1 nova_compute[183751]: 2026-01-27 21:15:52.060 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:52 compute-1 nova_compute[183751]: 2026-01-27 21:15:52.060 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:52 compute-1 nova_compute[183751]: 2026-01-27 21:15:52.061 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:52 compute-1 nova_compute[183751]: 2026-01-27 21:15:52.061 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:52 compute-1 nova_compute[183751]: 2026-01-27 21:15:52.061 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:15:52 compute-1 nova_compute[183751]: 2026-01-27 21:15:52.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:15:54 compute-1 podman[203783]: 2026-01-27 21:15:54.781351428 +0000 UTC m=+0.076726170 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal)
Jan 27 21:15:54 compute-1 podman[203784]: 2026-01-27 21:15:54.790114016 +0000 UTC m=+0.077269533 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 27 21:16:04 compute-1 podman[203821]: 2026-01-27 21:16:04.786695862 +0000 UTC m=+0.086776739 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:16:05 compute-1 podman[193064]: time="2026-01-27T21:16:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:16:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:16:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:16:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:16:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Jan 27 21:16:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:16:11.161 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:16:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:16:11.161 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:16:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:16:11.161 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:16:19 compute-1 openstack_network_exporter[195945]: ERROR   21:16:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:16:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:16:19 compute-1 openstack_network_exporter[195945]: ERROR   21:16:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:16:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:16:22 compute-1 podman[203846]: 2026-01-27 21:16:22.781722262 +0000 UTC m=+0.093405124 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:16:25 compute-1 podman[203874]: 2026-01-27 21:16:25.743242241 +0000 UTC m=+0.057843490 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Jan 27 21:16:25 compute-1 podman[203875]: 2026-01-27 21:16:25.76693496 +0000 UTC m=+0.080353799 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 27 21:16:35 compute-1 podman[193064]: time="2026-01-27T21:16:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:16:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:16:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:16:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:16:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Jan 27 21:16:35 compute-1 podman[203911]: 2026-01-27 21:16:35.747239979 +0000 UTC m=+0.063629354 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:16:48 compute-1 nova_compute[183751]: 2026-01-27 21:16:48.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:16:48 compute-1 nova_compute[183751]: 2026-01-27 21:16:48.835 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:16:48 compute-1 nova_compute[183751]: 2026-01-27 21:16:48.836 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:16:48 compute-1 nova_compute[183751]: 2026-01-27 21:16:48.837 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:16:48 compute-1 nova_compute[183751]: 2026-01-27 21:16:48.837 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:16:49 compute-1 nova_compute[183751]: 2026-01-27 21:16:49.039 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:16:49 compute-1 nova_compute[183751]: 2026-01-27 21:16:49.041 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:16:49 compute-1 nova_compute[183751]: 2026-01-27 21:16:49.061 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:16:49 compute-1 nova_compute[183751]: 2026-01-27 21:16:49.062 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6144MB free_disk=73.18144226074219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:16:49 compute-1 nova_compute[183751]: 2026-01-27 21:16:49.063 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:16:49 compute-1 nova_compute[183751]: 2026-01-27 21:16:49.063 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:16:49 compute-1 openstack_network_exporter[195945]: ERROR   21:16:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:16:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:16:49 compute-1 openstack_network_exporter[195945]: ERROR   21:16:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:16:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:16:50 compute-1 nova_compute[183751]: 2026-01-27 21:16:50.314 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:16:50 compute-1 nova_compute[183751]: 2026-01-27 21:16:50.315 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:16:49 up  1:19,  0 user,  load average: 0.10, 0.37, 0.43\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:16:50 compute-1 nova_compute[183751]: 2026-01-27 21:16:50.450 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 21:16:50 compute-1 nova_compute[183751]: 2026-01-27 21:16:50.738 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 21:16:50 compute-1 nova_compute[183751]: 2026-01-27 21:16:50.739 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:16:50 compute-1 nova_compute[183751]: 2026-01-27 21:16:50.754 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 21:16:50 compute-1 nova_compute[183751]: 2026-01-27 21:16:50.788 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 21:16:50 compute-1 nova_compute[183751]: 2026-01-27 21:16:50.811 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:16:51 compute-1 nova_compute[183751]: 2026-01-27 21:16:51.318 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:16:51 compute-1 nova_compute[183751]: 2026-01-27 21:16:51.830 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:16:51 compute-1 nova_compute[183751]: 2026-01-27 21:16:51.831 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.767s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:16:52 compute-1 nova_compute[183751]: 2026-01-27 21:16:52.826 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:16:53 compute-1 nova_compute[183751]: 2026-01-27 21:16:53.633 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:16:53 compute-1 nova_compute[183751]: 2026-01-27 21:16:53.633 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:16:53 compute-1 nova_compute[183751]: 2026-01-27 21:16:53.634 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:16:53 compute-1 nova_compute[183751]: 2026-01-27 21:16:53.634 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:16:53 compute-1 nova_compute[183751]: 2026-01-27 21:16:53.634 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:16:53 compute-1 nova_compute[183751]: 2026-01-27 21:16:53.634 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:16:53 compute-1 nova_compute[183751]: 2026-01-27 21:16:53.635 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:16:53 compute-1 podman[203937]: 2026-01-27 21:16:53.832089014 +0000 UTC m=+0.140507236 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Jan 27 21:16:53 compute-1 nova_compute[183751]: 2026-01-27 21:16:53.952 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:16:56 compute-1 podman[203964]: 2026-01-27 21:16:56.750486054 +0000 UTC m=+0.060336336 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 21:16:56 compute-1 podman[203963]: 2026-01-27 21:16:56.757849931 +0000 UTC m=+0.069754935 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Jan 27 21:17:05 compute-1 podman[193064]: time="2026-01-27T21:17:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:17:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:17:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:17:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:17:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Jan 27 21:17:06 compute-1 podman[204001]: 2026-01-27 21:17:06.770734028 +0000 UTC m=+0.074771833 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:17:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:17:11.162 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:17:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:17:11.163 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:17:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:17:11.163 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:17:19 compute-1 openstack_network_exporter[195945]: ERROR   21:17:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:17:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:17:19 compute-1 openstack_network_exporter[195945]: ERROR   21:17:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:17:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:17:24 compute-1 podman[204027]: 2026-01-27 21:17:24.789143223 +0000 UTC m=+0.102406076 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:17:27 compute-1 podman[204054]: 2026-01-27 21:17:27.769219821 +0000 UTC m=+0.070469594 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:17:27 compute-1 podman[204053]: 2026-01-27 21:17:27.769113068 +0000 UTC m=+0.074508166 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41)
Jan 27 21:17:35 compute-1 podman[193064]: time="2026-01-27T21:17:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:17:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:17:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:17:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:17:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2157 "" "Go-http-client/1.1"
Jan 27 21:17:37 compute-1 podman[204091]: 2026-01-27 21:17:37.756238053 +0000 UTC m=+0.067313254 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:17:49 compute-1 openstack_network_exporter[195945]: ERROR   21:17:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:17:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:17:49 compute-1 openstack_network_exporter[195945]: ERROR   21:17:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:17:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.669 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.860 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.861 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.879 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.880 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6147MB free_disk=73.18194961547852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.880 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:17:50 compute-1 nova_compute[183751]: 2026-01-27 21:17:50.881 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:17:51 compute-1 nova_compute[183751]: 2026-01-27 21:17:51.933 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:17:51 compute-1 nova_compute[183751]: 2026-01-27 21:17:51.933 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:17:50 up  1:20,  0 user,  load average: 0.07, 0.31, 0.40\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:17:51 compute-1 nova_compute[183751]: 2026-01-27 21:17:51.957 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:17:52 compute-1 nova_compute[183751]: 2026-01-27 21:17:52.466 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:17:52 compute-1 nova_compute[183751]: 2026-01-27 21:17:52.981 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:17:52 compute-1 nova_compute[183751]: 2026-01-27 21:17:52.981 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:17:53 compute-1 nova_compute[183751]: 2026-01-27 21:17:53.980 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:17:53 compute-1 nova_compute[183751]: 2026-01-27 21:17:53.981 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:17:53 compute-1 nova_compute[183751]: 2026-01-27 21:17:53.981 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:17:53 compute-1 nova_compute[183751]: 2026-01-27 21:17:53.981 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:17:53 compute-1 nova_compute[183751]: 2026-01-27 21:17:53.981 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:17:55 compute-1 nova_compute[183751]: 2026-01-27 21:17:55.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:17:55 compute-1 podman[204116]: 2026-01-27 21:17:55.782648592 +0000 UTC m=+0.098665840 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Jan 27 21:17:58 compute-1 podman[204143]: 2026-01-27 21:17:58.748111354 +0000 UTC m=+0.056930211 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 27 21:17:58 compute-1 podman[204142]: 2026-01-27 21:17:58.757325876 +0000 UTC m=+0.072920783 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Jan 27 21:18:05 compute-1 podman[193064]: time="2026-01-27T21:18:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:18:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:18:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:18:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:18:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2151 "" "Go-http-client/1.1"
Jan 27 21:18:08 compute-1 podman[204182]: 2026-01-27 21:18:08.756117317 +0000 UTC m=+0.066545292 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:18:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:18:11.164 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:18:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:18:11.164 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:18:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:18:11.164 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:18:19 compute-1 openstack_network_exporter[195945]: ERROR   21:18:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:18:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:18:19 compute-1 openstack_network_exporter[195945]: ERROR   21:18:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:18:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:18:26 compute-1 podman[204206]: 2026-01-27 21:18:26.835402501 +0000 UTC m=+0.136626514 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Jan 27 21:18:29 compute-1 podman[204235]: 2026-01-27 21:18:29.761662659 +0000 UTC m=+0.065597769 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:18:29 compute-1 podman[204236]: 2026-01-27 21:18:29.773739352 +0000 UTC m=+0.064755947 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 27 21:18:35 compute-1 podman[193064]: time="2026-01-27T21:18:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:18:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:18:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:18:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:18:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2154 "" "Go-http-client/1.1"
Jan 27 21:18:38 compute-1 sshd-session[204271]: Invalid user solr from 80.94.92.186 port 43182
Jan 27 21:18:38 compute-1 sshd-session[204271]: Connection closed by invalid user solr 80.94.92.186 port 43182 [preauth]
Jan 27 21:18:39 compute-1 podman[204273]: 2026-01-27 21:18:39.764568283 +0000 UTC m=+0.061892266 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:18:49 compute-1 openstack_network_exporter[195945]: ERROR   21:18:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:18:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:18:49 compute-1 openstack_network_exporter[195945]: ERROR   21:18:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:18:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:18:50 compute-1 nova_compute[183751]: 2026-01-27 21:18:50.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.670 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.671 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.671 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.672 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.906 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.908 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.946 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.948 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6170MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.948 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:18:52 compute-1 nova_compute[183751]: 2026-01-27 21:18:52.949 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:18:54 compute-1 nova_compute[183751]: 2026-01-27 21:18:54.004 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:18:54 compute-1 nova_compute[183751]: 2026-01-27 21:18:54.005 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:18:52 up  1:21,  0 user,  load average: 0.27, 0.34, 0.41\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:18:54 compute-1 nova_compute[183751]: 2026-01-27 21:18:54.028 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:18:54 compute-1 nova_compute[183751]: 2026-01-27 21:18:54.534 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:18:55 compute-1 nova_compute[183751]: 2026-01-27 21:18:55.131 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:18:55 compute-1 nova_compute[183751]: 2026-01-27 21:18:55.131 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.182s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:18:56 compute-1 nova_compute[183751]: 2026-01-27 21:18:56.130 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:18:56 compute-1 nova_compute[183751]: 2026-01-27 21:18:56.644 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:18:56 compute-1 nova_compute[183751]: 2026-01-27 21:18:56.645 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:18:56 compute-1 nova_compute[183751]: 2026-01-27 21:18:56.645 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:18:56 compute-1 nova_compute[183751]: 2026-01-27 21:18:56.646 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:18:56 compute-1 nova_compute[183751]: 2026-01-27 21:18:56.658 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:18:57 compute-1 podman[204299]: 2026-01-27 21:18:57.8002985 +0000 UTC m=+0.106951547 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260126, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 27 21:19:00 compute-1 podman[204327]: 2026-01-27 21:19:00.774274047 +0000 UTC m=+0.077984250 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 27 21:19:00 compute-1 podman[204326]: 2026-01-27 21:19:00.775565659 +0000 UTC m=+0.084009751 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6)
Jan 27 21:19:05 compute-1 podman[193064]: time="2026-01-27T21:19:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:19:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:19:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:19:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:19:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2158 "" "Go-http-client/1.1"
Jan 27 21:19:10 compute-1 podman[204366]: 2026-01-27 21:19:10.747543429 +0000 UTC m=+0.060224934 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:19:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:19:11.165 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:19:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:19:11.165 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:19:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:19:11.165 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:19:18 compute-1 sshd-session[204392]: error: kex_exchange_identification: read: Connection reset by peer
Jan 27 21:19:18 compute-1 sshd-session[204392]: Connection reset by 176.120.22.52 port 47016
Jan 27 21:19:19 compute-1 openstack_network_exporter[195945]: ERROR   21:19:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:19:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:19:19 compute-1 openstack_network_exporter[195945]: ERROR   21:19:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:19:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:19:28 compute-1 podman[204393]: 2026-01-27 21:19:28.792098682 +0000 UTC m=+0.109653375 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible)
Jan 27 21:19:31 compute-1 podman[204419]: 2026-01-27 21:19:31.754917599 +0000 UTC m=+0.061143637 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Jan 27 21:19:31 compute-1 podman[204420]: 2026-01-27 21:19:31.777037305 +0000 UTC m=+0.083938690 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:19:35 compute-1 podman[193064]: time="2026-01-27T21:19:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:19:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:19:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:19:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:19:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2157 "" "Go-http-client/1.1"
Jan 27 21:19:41 compute-1 podman[204458]: 2026-01-27 21:19:41.752270493 +0000 UTC m=+0.060399068 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:19:49 compute-1 openstack_network_exporter[195945]: ERROR   21:19:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:19:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:19:49 compute-1 openstack_network_exporter[195945]: ERROR   21:19:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:19:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:19:51 compute-1 nova_compute[183751]: 2026-01-27 21:19:51.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.666 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.856 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.858 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.881 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.882 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6176MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.882 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:19:52 compute-1 nova_compute[183751]: 2026-01-27 21:19:52.883 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:19:54 compute-1 nova_compute[183751]: 2026-01-27 21:19:54.057 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:19:54 compute-1 nova_compute[183751]: 2026-01-27 21:19:54.058 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:19:52 up  1:22,  0 user,  load average: 0.16, 0.29, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:19:54 compute-1 nova_compute[183751]: 2026-01-27 21:19:54.094 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:19:54 compute-1 nova_compute[183751]: 2026-01-27 21:19:54.602 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:19:55 compute-1 nova_compute[183751]: 2026-01-27 21:19:55.115 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:19:55 compute-1 nova_compute[183751]: 2026-01-27 21:19:55.116 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.233s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:19:56 compute-1 nova_compute[183751]: 2026-01-27 21:19:56.117 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:19:56 compute-1 nova_compute[183751]: 2026-01-27 21:19:56.118 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:19:56 compute-1 nova_compute[183751]: 2026-01-27 21:19:56.118 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:19:56 compute-1 nova_compute[183751]: 2026-01-27 21:19:56.118 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:19:56 compute-1 nova_compute[183751]: 2026-01-27 21:19:56.118 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:19:57 compute-1 nova_compute[183751]: 2026-01-27 21:19:57.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:19:59 compute-1 podman[204482]: 2026-01-27 21:19:59.820268152 +0000 UTC m=+0.123258790 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:20:02 compute-1 podman[204509]: 2026-01-27 21:20:02.780810644 +0000 UTC m=+0.077160306 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:20:02 compute-1 podman[204508]: 2026-01-27 21:20:02.814408658 +0000 UTC m=+0.115203180 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 27 21:20:05 compute-1 podman[193064]: time="2026-01-27T21:20:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:20:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:20:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:20:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:20:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2151 "" "Go-http-client/1.1"
Jan 27 21:20:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:20:11.166 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:20:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:20:11.167 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:20:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:20:11.167 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:20:12 compute-1 podman[204551]: 2026-01-27 21:20:12.77693427 +0000 UTC m=+0.078356066 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:20:19 compute-1 openstack_network_exporter[195945]: ERROR   21:20:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:20:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:20:19 compute-1 openstack_network_exporter[195945]: ERROR   21:20:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:20:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:20:30 compute-1 podman[204576]: 2026-01-27 21:20:30.789776242 +0000 UTC m=+0.093686846 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 21:20:33 compute-1 podman[204603]: 2026-01-27 21:20:33.750736595 +0000 UTC m=+0.052181746 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Jan 27 21:20:33 compute-1 podman[204602]: 2026-01-27 21:20:33.793371413 +0000 UTC m=+0.098709391 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, version=9.6, config_id=openstack_network_exporter)
Jan 27 21:20:35 compute-1 podman[193064]: time="2026-01-27T21:20:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:20:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:20:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:20:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:20:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Jan 27 21:20:43 compute-1 podman[204639]: 2026-01-27 21:20:43.748132231 +0000 UTC m=+0.059406825 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:20:49 compute-1 openstack_network_exporter[195945]: ERROR   21:20:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:20:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:20:49 compute-1 openstack_network_exporter[195945]: ERROR   21:20:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:20:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:20:50 compute-1 nova_compute[183751]: 2026-01-27 21:20:50.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:50 compute-1 nova_compute[183751]: 2026-01-27 21:20:50.150 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 21:20:50 compute-1 nova_compute[183751]: 2026-01-27 21:20:50.660 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 21:20:51 compute-1 nova_compute[183751]: 2026-01-27 21:20:51.151 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:51 compute-1 nova_compute[183751]: 2026-01-27 21:20:51.151 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:51 compute-1 nova_compute[183751]: 2026-01-27 21:20:51.152 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 21:20:52 compute-1 nova_compute[183751]: 2026-01-27 21:20:52.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:53 compute-1 nova_compute[183751]: 2026-01-27 21:20:53.656 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:53 compute-1 nova_compute[183751]: 2026-01-27 21:20:53.657 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:54 compute-1 nova_compute[183751]: 2026-01-27 21:20:54.173 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:20:54 compute-1 nova_compute[183751]: 2026-01-27 21:20:54.174 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:20:54 compute-1 nova_compute[183751]: 2026-01-27 21:20:54.174 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:20:54 compute-1 nova_compute[183751]: 2026-01-27 21:20:54.174 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:20:54 compute-1 nova_compute[183751]: 2026-01-27 21:20:54.359 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:20:54 compute-1 nova_compute[183751]: 2026-01-27 21:20:54.361 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:20:54 compute-1 nova_compute[183751]: 2026-01-27 21:20:54.385 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:20:54 compute-1 nova_compute[183751]: 2026-01-27 21:20:54.386 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6181MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:20:54 compute-1 nova_compute[183751]: 2026-01-27 21:20:54.387 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:20:54 compute-1 nova_compute[183751]: 2026-01-27 21:20:54.387 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:20:55 compute-1 nova_compute[183751]: 2026-01-27 21:20:55.440 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:20:55 compute-1 nova_compute[183751]: 2026-01-27 21:20:55.441 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:20:54 up  1:23,  0 user,  load average: 0.35, 0.31, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:20:55 compute-1 nova_compute[183751]: 2026-01-27 21:20:55.469 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:20:55 compute-1 nova_compute[183751]: 2026-01-27 21:20:55.979 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:20:56 compute-1 nova_compute[183751]: 2026-01-27 21:20:56.490 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:20:56 compute-1 nova_compute[183751]: 2026-01-27 21:20:56.491 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:20:57 compute-1 nova_compute[183751]: 2026-01-27 21:20:57.983 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:57 compute-1 nova_compute[183751]: 2026-01-27 21:20:57.984 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:57 compute-1 nova_compute[183751]: 2026-01-27 21:20:57.984 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:57 compute-1 nova_compute[183751]: 2026-01-27 21:20:57.985 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:57 compute-1 nova_compute[183751]: 2026-01-27 21:20:57.985 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:20:57 compute-1 nova_compute[183751]: 2026-01-27 21:20:57.986 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:20:58 compute-1 nova_compute[183751]: 2026-01-27 21:20:58.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:21:01 compute-1 podman[204664]: 2026-01-27 21:21:01.812326459 +0000 UTC m=+0.128307485 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 21:21:04 compute-1 podman[204691]: 2026-01-27 21:21:04.765799317 +0000 UTC m=+0.063498247 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:21:04 compute-1 podman[204690]: 2026-01-27 21:21:04.795830992 +0000 UTC m=+0.099032179 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Jan 27 21:21:05 compute-1 podman[193064]: time="2026-01-27T21:21:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:21:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:21:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:21:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:21:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2158 "" "Go-http-client/1.1"
Jan 27 21:21:09 compute-1 nova_compute[183751]: 2026-01-27 21:21:09.787 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:21:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:21:11.168 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:21:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:21:11.168 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:21:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:21:11.168 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:21:14 compute-1 podman[204731]: 2026-01-27 21:21:14.740653074 +0000 UTC m=+0.054859531 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:21:19 compute-1 openstack_network_exporter[195945]: ERROR   21:21:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:21:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:21:19 compute-1 openstack_network_exporter[195945]: ERROR   21:21:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:21:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:21:32 compute-1 podman[204755]: 2026-01-27 21:21:32.852653727 +0000 UTC m=+0.160956955 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 21:21:35 compute-1 podman[193064]: time="2026-01-27T21:21:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:21:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:21:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:21:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:21:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2158 "" "Go-http-client/1.1"
Jan 27 21:21:35 compute-1 podman[204783]: 2026-01-27 21:21:35.767073086 +0000 UTC m=+0.061727473 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest)
Jan 27 21:21:35 compute-1 podman[204782]: 2026-01-27 21:21:35.782799276 +0000 UTC m=+0.094380083 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:21:45 compute-1 podman[204822]: 2026-01-27 21:21:45.737354859 +0000 UTC m=+0.049662353 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:21:49 compute-1 openstack_network_exporter[195945]: ERROR   21:21:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:21:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:21:49 compute-1 openstack_network_exporter[195945]: ERROR   21:21:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:21:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:21:52 compute-1 nova_compute[183751]: 2026-01-27 21:21:52.661 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:21:53 compute-1 nova_compute[183751]: 2026-01-27 21:21:53.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.822 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.823 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.835 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.836 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6180MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.836 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:21:54 compute-1 nova_compute[183751]: 2026-01-27 21:21:54.836 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:21:56 compute-1 nova_compute[183751]: 2026-01-27 21:21:56.339 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:21:56 compute-1 nova_compute[183751]: 2026-01-27 21:21:56.340 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:21:54 up  1:24,  0 user,  load average: 0.13, 0.25, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:21:56 compute-1 nova_compute[183751]: 2026-01-27 21:21:56.482 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 21:21:56 compute-1 nova_compute[183751]: 2026-01-27 21:21:56.593 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 21:21:56 compute-1 nova_compute[183751]: 2026-01-27 21:21:56.593 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:21:56 compute-1 nova_compute[183751]: 2026-01-27 21:21:56.612 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 21:21:56 compute-1 nova_compute[183751]: 2026-01-27 21:21:56.645 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 21:21:56 compute-1 nova_compute[183751]: 2026-01-27 21:21:56.666 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:21:57 compute-1 nova_compute[183751]: 2026-01-27 21:21:57.174 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:21:57 compute-1 nova_compute[183751]: 2026-01-27 21:21:57.686 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:21:57 compute-1 nova_compute[183751]: 2026-01-27 21:21:57.687 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.850s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:21:58 compute-1 nova_compute[183751]: 2026-01-27 21:21:58.687 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:21:58 compute-1 nova_compute[183751]: 2026-01-27 21:21:58.687 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:21:58 compute-1 nova_compute[183751]: 2026-01-27 21:21:58.688 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:21:58 compute-1 nova_compute[183751]: 2026-01-27 21:21:58.688 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:21:58 compute-1 nova_compute[183751]: 2026-01-27 21:21:58.688 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:21:58 compute-1 nova_compute[183751]: 2026-01-27 21:21:58.688 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:22:03 compute-1 podman[204848]: 2026-01-27 21:22:03.826813183 +0000 UTC m=+0.130022528 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 27 21:22:05 compute-1 podman[193064]: time="2026-01-27T21:22:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:22:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:22:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:22:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:22:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2158 "" "Go-http-client/1.1"
Jan 27 21:22:06 compute-1 podman[204875]: 2026-01-27 21:22:06.762011163 +0000 UTC m=+0.060785327 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260126, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 27 21:22:06 compute-1 podman[204874]: 2026-01-27 21:22:06.781688277 +0000 UTC m=+0.078642796 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:22:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:22:11.169 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:22:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:22:11.170 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:22:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:22:11.170 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:22:16 compute-1 podman[204913]: 2026-01-27 21:22:16.757623237 +0000 UTC m=+0.064081929 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:22:19 compute-1 openstack_network_exporter[195945]: ERROR   21:22:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:22:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:22:19 compute-1 openstack_network_exporter[195945]: ERROR   21:22:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:22:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:22:34 compute-1 podman[204938]: 2026-01-27 21:22:34.803311696 +0000 UTC m=+0.108153293 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:22:35 compute-1 podman[193064]: time="2026-01-27T21:22:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:22:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:22:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:22:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:22:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2159 "" "Go-http-client/1.1"
Jan 27 21:22:37 compute-1 podman[204966]: 2026-01-27 21:22:37.762159341 +0000 UTC m=+0.067402170 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:22:37 compute-1 podman[204967]: 2026-01-27 21:22:37.763698989 +0000 UTC m=+0.066734164 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:22:47 compute-1 podman[205005]: 2026-01-27 21:22:47.784464119 +0000 UTC m=+0.083834884 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:22:49 compute-1 openstack_network_exporter[195945]: ERROR   21:22:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:22:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:22:49 compute-1 openstack_network_exporter[195945]: ERROR   21:22:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:22:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:22:53 compute-1 nova_compute[183751]: 2026-01-27 21:22:53.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:22:55 compute-1 nova_compute[183751]: 2026-01-27 21:22:55.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.666 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.894 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.896 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.935 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.936 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6186MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.936 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:22:56 compute-1 nova_compute[183751]: 2026-01-27 21:22:56.936 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:22:57 compute-1 nova_compute[183751]: 2026-01-27 21:22:57.993 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:22:57 compute-1 nova_compute[183751]: 2026-01-27 21:22:57.994 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:22:56 up  1:25,  0 user,  load average: 0.08, 0.22, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:22:58 compute-1 nova_compute[183751]: 2026-01-27 21:22:58.036 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:22:58 compute-1 nova_compute[183751]: 2026-01-27 21:22:58.543 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:22:59 compute-1 nova_compute[183751]: 2026-01-27 21:22:59.053 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:22:59 compute-1 nova_compute[183751]: 2026-01-27 21:22:59.053 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:23:00 compute-1 nova_compute[183751]: 2026-01-27 21:23:00.052 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:23:00 compute-1 nova_compute[183751]: 2026-01-27 21:23:00.053 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:23:00 compute-1 nova_compute[183751]: 2026-01-27 21:23:00.053 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:23:00 compute-1 nova_compute[183751]: 2026-01-27 21:23:00.053 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:23:00 compute-1 nova_compute[183751]: 2026-01-27 21:23:00.053 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:23:00 compute-1 nova_compute[183751]: 2026-01-27 21:23:00.054 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:23:00 compute-1 nova_compute[183751]: 2026-01-27 21:23:00.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:23:05 compute-1 podman[193064]: time="2026-01-27T21:23:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:23:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:23:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:23:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:23:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2161 "" "Go-http-client/1.1"
Jan 27 21:23:05 compute-1 podman[205033]: 2026-01-27 21:23:05.820281235 +0000 UTC m=+0.127595351 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:23:08 compute-1 podman[205060]: 2026-01-27 21:23:08.768420788 +0000 UTC m=+0.074345911 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 21:23:08 compute-1 podman[205061]: 2026-01-27 21:23:08.77905393 +0000 UTC m=+0.071076381 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126)
Jan 27 21:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:23:11.171 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:23:11.171 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:23:11.171 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:23:18 compute-1 podman[205100]: 2026-01-27 21:23:18.748031157 +0000 UTC m=+0.058631925 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:23:19 compute-1 openstack_network_exporter[195945]: ERROR   21:23:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:23:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:23:19 compute-1 openstack_network_exporter[195945]: ERROR   21:23:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:23:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:23:35 compute-1 podman[193064]: time="2026-01-27T21:23:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:23:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:23:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:23:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:23:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2158 "" "Go-http-client/1.1"
Jan 27 21:23:36 compute-1 podman[205124]: 2026-01-27 21:23:36.785199956 +0000 UTC m=+0.098579367 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest)
Jan 27 21:23:39 compute-1 podman[205152]: 2026-01-27 21:23:39.754765146 +0000 UTC m=+0.065266377 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 27 21:23:39 compute-1 podman[205153]: 2026-01-27 21:23:39.764615347 +0000 UTC m=+0.067281506 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:23:42 compute-1 sshd-session[205194]: Invalid user ubuntu from 80.94.92.186 port 46232
Jan 27 21:23:42 compute-1 sshd-session[205194]: Connection closed by invalid user ubuntu 80.94.92.186 port 46232 [preauth]
Jan 27 21:23:49 compute-1 openstack_network_exporter[195945]: ERROR   21:23:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:23:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:23:49 compute-1 openstack_network_exporter[195945]: ERROR   21:23:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:23:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:23:49 compute-1 podman[205196]: 2026-01-27 21:23:49.762005364 +0000 UTC m=+0.071697356 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:23:54 compute-1 nova_compute[183751]: 2026-01-27 21:23:54.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.671 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.672 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.672 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.672 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.835 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.836 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.848 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.849 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6180MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.849 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:23:57 compute-1 nova_compute[183751]: 2026-01-27 21:23:57.849 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:23:58 compute-1 nova_compute[183751]: 2026-01-27 21:23:58.906 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:23:58 compute-1 nova_compute[183751]: 2026-01-27 21:23:58.906 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:23:57 up  1:26,  0 user,  load average: 0.10, 0.20, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:23:58 compute-1 nova_compute[183751]: 2026-01-27 21:23:58.938 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:23:59 compute-1 nova_compute[183751]: 2026-01-27 21:23:59.448 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:23:59 compute-1 nova_compute[183751]: 2026-01-27 21:23:59.957 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:23:59 compute-1 nova_compute[183751]: 2026-01-27 21:23:59.958 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.109s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:24:02 compute-1 nova_compute[183751]: 2026-01-27 21:24:02.960 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:24:02 compute-1 nova_compute[183751]: 2026-01-27 21:24:02.960 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:24:02 compute-1 nova_compute[183751]: 2026-01-27 21:24:02.960 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:24:02 compute-1 nova_compute[183751]: 2026-01-27 21:24:02.961 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:24:02 compute-1 nova_compute[183751]: 2026-01-27 21:24:02.961 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:24:05 compute-1 podman[193064]: time="2026-01-27T21:24:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:24:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:24:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:24:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:24:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2158 "" "Go-http-client/1.1"
Jan 27 21:24:07 compute-1 podman[205223]: 2026-01-27 21:24:07.794860556 +0000 UTC m=+0.099395227 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:24:10 compute-1 podman[205251]: 2026-01-27 21:24:10.738729003 +0000 UTC m=+0.046853844 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126)
Jan 27 21:24:10 compute-1 podman[205250]: 2026-01-27 21:24:10.739964924 +0000 UTC m=+0.054773959 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, 
com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Jan 27 21:24:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:24:11.172 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:24:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:24:11.172 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:24:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:24:11.173 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:24:19 compute-1 openstack_network_exporter[195945]: ERROR   21:24:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:24:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:24:19 compute-1 openstack_network_exporter[195945]: ERROR   21:24:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:24:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:24:20 compute-1 podman[205292]: 2026-01-27 21:24:20.77323009 +0000 UTC m=+0.077930904 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:24:35 compute-1 podman[193064]: time="2026-01-27T21:24:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:24:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:24:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:24:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:24:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2157 "" "Go-http-client/1.1"
Jan 27 21:24:38 compute-1 podman[205316]: 2026-01-27 21:24:38.770680988 +0000 UTC m=+0.085192516 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:24:41 compute-1 podman[205341]: 2026-01-27 21:24:41.779562934 +0000 UTC m=+0.084267304 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Jan 27 21:24:41 compute-1 podman[205342]: 2026-01-27 21:24:41.800004198 +0000 UTC m=+0.088323250 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 27 21:24:49 compute-1 openstack_network_exporter[195945]: ERROR   21:24:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:24:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:24:49 compute-1 openstack_network_exporter[195945]: ERROR   21:24:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:24:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:24:51 compute-1 podman[205381]: 2026-01-27 21:24:51.754287274 +0000 UTC m=+0.063757979 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:24:55 compute-1 nova_compute[183751]: 2026-01-27 21:24:55.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:24:57 compute-1 nova_compute[183751]: 2026-01-27 21:24:57.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:24:58 compute-1 nova_compute[183751]: 2026-01-27 21:24:58.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.671 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.672 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.672 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.672 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.869 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.870 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.903 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.904 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6175MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.905 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:24:59 compute-1 nova_compute[183751]: 2026-01-27 21:24:59.905 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:25:00 compute-1 nova_compute[183751]: 2026-01-27 21:25:00.983 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:25:00 compute-1 nova_compute[183751]: 2026-01-27 21:25:00.983 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:24:59 up  1:27,  0 user,  load average: 0.09, 0.18, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:25:01 compute-1 nova_compute[183751]: 2026-01-27 21:25:01.013 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:25:01 compute-1 nova_compute[183751]: 2026-01-27 21:25:01.520 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:25:02 compute-1 nova_compute[183751]: 2026-01-27 21:25:02.030 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:25:02 compute-1 nova_compute[183751]: 2026-01-27 21:25:02.031 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.126s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:25:03 compute-1 nova_compute[183751]: 2026-01-27 21:25:03.032 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:25:03 compute-1 nova_compute[183751]: 2026-01-27 21:25:03.033 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:25:03 compute-1 nova_compute[183751]: 2026-01-27 21:25:03.643 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:25:03 compute-1 nova_compute[183751]: 2026-01-27 21:25:03.644 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:25:03 compute-1 nova_compute[183751]: 2026-01-27 21:25:03.644 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:25:03 compute-1 nova_compute[183751]: 2026-01-27 21:25:03.645 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:25:05 compute-1 podman[193064]: time="2026-01-27T21:25:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:25:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:25:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:25:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:25:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2157 "" "Go-http-client/1.1"
Jan 27 21:25:09 compute-1 podman[205406]: 2026-01-27 21:25:09.793802756 +0000 UTC m=+0.101705407 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, container_name=ovn_controller, 
io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 27 21:25:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:25:11.174 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:25:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:25:11.174 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:25:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:25:11.175 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:25:12 compute-1 podman[205436]: 2026-01-27 21:25:12.795678018 +0000 UTC m=+0.087500111 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 21:25:12 compute-1 podman[205435]: 2026-01-27 21:25:12.80635674 +0000 UTC m=+0.105896895 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, config_id=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': 
True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Jan 27 21:25:19 compute-1 openstack_network_exporter[195945]: ERROR   21:25:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:25:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:25:19 compute-1 openstack_network_exporter[195945]: ERROR   21:25:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:25:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:25:22 compute-1 podman[205477]: 2026-01-27 21:25:22.752466693 +0000 UTC m=+0.065454479 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:25:35 compute-1 podman[193064]: time="2026-01-27T21:25:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:25:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:25:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:25:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:25:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2158 "" "Go-http-client/1.1"
Jan 27 21:25:40 compute-1 podman[205501]: 2026-01-27 21:25:40.79206414 +0000 UTC m=+0.106684324 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, 
container_name=ovn_controller, io.buildah.version=1.41.4)
Jan 27 21:25:43 compute-1 podman[205528]: 2026-01-27 21:25:43.771656023 +0000 UTC m=+0.063997054 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:25:43 compute-1 podman[205527]: 2026-01-27 21:25:43.772207366 +0000 UTC m=+0.081345045 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public)
Jan 27 21:25:49 compute-1 openstack_network_exporter[195945]: ERROR   21:25:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:25:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:25:49 compute-1 openstack_network_exporter[195945]: ERROR   21:25:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:25:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:25:53 compute-1 podman[205567]: 2026-01-27 21:25:53.767937753 +0000 UTC m=+0.070617381 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:25:55 compute-1 nova_compute[183751]: 2026-01-27 21:25:55.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:25:55 compute-1 nova_compute[183751]: 2026-01-27 21:25:55.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 21:25:57 compute-1 nova_compute[183751]: 2026-01-27 21:25:57.655 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:25:58 compute-1 nova_compute[183751]: 2026-01-27 21:25:58.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:25:59 compute-1 nova_compute[183751]: 2026-01-27 21:25:59.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:25:59 compute-1 nova_compute[183751]: 2026-01-27 21:25:59.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:25:59 compute-1 nova_compute[183751]: 2026-01-27 21:25:59.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 21:25:59 compute-1 nova_compute[183751]: 2026-01-27 21:25:59.659 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 21:26:00 compute-1 nova_compute[183751]: 2026-01-27 21:26:00.659 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:26:00 compute-1 nova_compute[183751]: 2026-01-27 21:26:00.660 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:26:01 compute-1 nova_compute[183751]: 2026-01-27 21:26:01.176 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:26:01 compute-1 nova_compute[183751]: 2026-01-27 21:26:01.177 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:26:01 compute-1 nova_compute[183751]: 2026-01-27 21:26:01.177 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:26:01 compute-1 nova_compute[183751]: 2026-01-27 21:26:01.177 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:26:01 compute-1 nova_compute[183751]: 2026-01-27 21:26:01.337 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:26:01 compute-1 nova_compute[183751]: 2026-01-27 21:26:01.337 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:26:01 compute-1 nova_compute[183751]: 2026-01-27 21:26:01.353 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:26:01 compute-1 nova_compute[183751]: 2026-01-27 21:26:01.354 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6182MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:26:01 compute-1 nova_compute[183751]: 2026-01-27 21:26:01.354 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:26:01 compute-1 nova_compute[183751]: 2026-01-27 21:26:01.355 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:26:02 compute-1 nova_compute[183751]: 2026-01-27 21:26:02.454 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:26:02 compute-1 nova_compute[183751]: 2026-01-27 21:26:02.455 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:26:01 up  1:28,  0 user,  load average: 0.03, 0.14, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:26:02 compute-1 nova_compute[183751]: 2026-01-27 21:26:02.499 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:26:03 compute-1 nova_compute[183751]: 2026-01-27 21:26:03.009 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:26:03 compute-1 nova_compute[183751]: 2026-01-27 21:26:03.520 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:26:03 compute-1 nova_compute[183751]: 2026-01-27 21:26:03.521 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.166s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:26:05 compute-1 podman[193064]: time="2026-01-27T21:26:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:26:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:26:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:26:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:26:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2161 "" "Go-http-client/1.1"
Jan 27 21:26:06 compute-1 nova_compute[183751]: 2026-01-27 21:26:06.006 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:26:06 compute-1 nova_compute[183751]: 2026-01-27 21:26:06.006 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:26:06 compute-1 nova_compute[183751]: 2026-01-27 21:26:06.007 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:26:06 compute-1 nova_compute[183751]: 2026-01-27 21:26:06.007 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:26:06 compute-1 nova_compute[183751]: 2026-01-27 21:26:06.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:26:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:26:11.176 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:26:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:26:11.176 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:26:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:26:11.176 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:26:11 compute-1 podman[205593]: 2026-01-27 21:26:11.782783292 +0000 UTC m=+0.095041119 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 27 21:26:14 compute-1 podman[205623]: 2026-01-27 21:26:14.77523459 +0000 UTC m=+0.071286716 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 27 21:26:14 compute-1 podman[205622]: 2026-01-27 21:26:14.789826045 +0000 UTC m=+0.099913053 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, 
container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vcs-type=git)
Jan 27 21:26:19 compute-1 openstack_network_exporter[195945]: ERROR   21:26:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:26:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:26:19 compute-1 openstack_network_exporter[195945]: ERROR   21:26:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:26:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:26:24 compute-1 podman[205662]: 2026-01-27 21:26:24.773346104 +0000 UTC m=+0.083695100 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:26:35 compute-1 podman[193064]: time="2026-01-27T21:26:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:26:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:26:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:26:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:26:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2157 "" "Go-http-client/1.1"
Jan 27 21:26:42 compute-1 podman[205687]: 2026-01-27 21:26:42.818673429 +0000 UTC m=+0.128589383 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, 
managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 21:26:45 compute-1 podman[205713]: 2026-01-27 21:26:45.759852273 +0000 UTC m=+0.064379264 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, config_id=openstack_network_exporter, version=9.6, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Jan 27 21:26:45 compute-1 podman[205714]: 2026-01-27 21:26:45.781047097 +0000 UTC m=+0.072744411 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0)
Jan 27 21:26:49 compute-1 openstack_network_exporter[195945]: ERROR   21:26:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:26:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:26:49 compute-1 openstack_network_exporter[195945]: ERROR   21:26:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:26:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:26:55 compute-1 podman[205754]: 2026-01-27 21:26:55.74169193 +0000 UTC m=+0.059463393 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:26:57 compute-1 nova_compute[183751]: 2026-01-27 21:26:57.655 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:26:59 compute-1 nova_compute[183751]: 2026-01-27 21:26:59.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:27:00 compute-1 nova_compute[183751]: 2026-01-27 21:27:00.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:27:01 compute-1 sshd-session[205778]: Invalid user admin from 45.148.10.121 port 59576
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.881 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.883 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.900 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.901 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6175MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.901 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:27:01 compute-1 nova_compute[183751]: 2026-01-27 21:27:01.901 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:27:01 compute-1 sshd-session[205778]: Connection closed by invalid user admin 45.148.10.121 port 59576 [preauth]
Jan 27 21:27:03 compute-1 nova_compute[183751]: 2026-01-27 21:27:03.071 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:27:03 compute-1 nova_compute[183751]: 2026-01-27 21:27:03.071 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:27:01 up  1:29,  0 user,  load average: 0.04, 0.13, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:27:03 compute-1 nova_compute[183751]: 2026-01-27 21:27:03.223 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 21:27:03 compute-1 nova_compute[183751]: 2026-01-27 21:27:03.377 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 21:27:03 compute-1 nova_compute[183751]: 2026-01-27 21:27:03.378 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:27:03 compute-1 nova_compute[183751]: 2026-01-27 21:27:03.397 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 21:27:03 compute-1 nova_compute[183751]: 2026-01-27 21:27:03.424 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 21:27:03 compute-1 nova_compute[183751]: 2026-01-27 21:27:03.445 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:27:03 compute-1 nova_compute[183751]: 2026-01-27 21:27:03.954 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:27:04 compute-1 nova_compute[183751]: 2026-01-27 21:27:04.464 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:27:04 compute-1 nova_compute[183751]: 2026-01-27 21:27:04.465 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.563s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:27:05 compute-1 nova_compute[183751]: 2026-01-27 21:27:05.466 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:27:05 compute-1 nova_compute[183751]: 2026-01-27 21:27:05.466 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:27:05 compute-1 podman[193064]: time="2026-01-27T21:27:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:27:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:27:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:27:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:27:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2159 "" "Go-http-client/1.1"
Jan 27 21:27:05 compute-1 nova_compute[183751]: 2026-01-27 21:27:05.979 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:27:05 compute-1 nova_compute[183751]: 2026-01-27 21:27:05.980 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:27:05 compute-1 nova_compute[183751]: 2026-01-27 21:27:05.980 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:27:05 compute-1 nova_compute[183751]: 2026-01-27 21:27:05.980 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:27:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:27:11.177 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:27:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:27:11.178 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:27:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:27:11.178 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:27:13 compute-1 podman[205782]: 2026-01-27 21:27:13.768811464 +0000 UTC m=+0.085409444 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 27 21:27:16 compute-1 podman[205810]: 2026-01-27 21:27:16.743933238 +0000 UTC m=+0.057014811 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 27 21:27:16 compute-1 podman[205809]: 2026-01-27 21:27:16.763881212 +0000 UTC m=+0.078652827 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc.)
Jan 27 21:27:19 compute-1 openstack_network_exporter[195945]: ERROR   21:27:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:27:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:27:19 compute-1 openstack_network_exporter[195945]: ERROR   21:27:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:27:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:27:26 compute-1 podman[205848]: 2026-01-27 21:27:26.734773317 +0000 UTC m=+0.049690640 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:27:35 compute-1 podman[193064]: time="2026-01-27T21:27:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:27:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:27:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:27:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:27:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2156 "" "Go-http-client/1.1"
Jan 27 21:27:44 compute-1 podman[205872]: 2026-01-27 21:27:44.810404331 +0000 UTC m=+0.127303591 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 21:27:47 compute-1 podman[205900]: 2026-01-27 21:27:47.780619203 +0000 UTC m=+0.081272872 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:27:47 compute-1 podman[205899]: 2026-01-27 21:27:47.783443053 +0000 UTC m=+0.090112301 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter)
Jan 27 21:27:49 compute-1 openstack_network_exporter[195945]: ERROR   21:27:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:27:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:27:49 compute-1 openstack_network_exporter[195945]: ERROR   21:27:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:27:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:27:57 compute-1 nova_compute[183751]: 2026-01-27 21:27:57.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:27:57 compute-1 podman[205939]: 2026-01-27 21:27:57.737956733 +0000 UTC m=+0.053547256 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:27:59 compute-1 nova_compute[183751]: 2026-01-27 21:27:59.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:28:00 compute-1 nova_compute[183751]: 2026-01-27 21:28:00.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.810 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.811 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.827 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.828 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6175MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.828 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:28:01 compute-1 nova_compute[183751]: 2026-01-27 21:28:01.828 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:28:02 compute-1 nova_compute[183751]: 2026-01-27 21:28:02.884 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:28:02 compute-1 nova_compute[183751]: 2026-01-27 21:28:02.884 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:28:01 up  1:30,  0 user,  load average: 0.09, 0.13, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:28:02 compute-1 nova_compute[183751]: 2026-01-27 21:28:02.918 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:28:03 compute-1 nova_compute[183751]: 2026-01-27 21:28:03.434 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:28:03 compute-1 nova_compute[183751]: 2026-01-27 21:28:03.943 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:28:03 compute-1 nova_compute[183751]: 2026-01-27 21:28:03.944 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:28:05 compute-1 podman[193064]: time="2026-01-27T21:28:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:28:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:28:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:28:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:28:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2156 "" "Go-http-client/1.1"
Jan 27 21:28:05 compute-1 nova_compute[183751]: 2026-01-27 21:28:05.944 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:28:05 compute-1 nova_compute[183751]: 2026-01-27 21:28:05.945 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:28:05 compute-1 nova_compute[183751]: 2026-01-27 21:28:05.945 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:28:07 compute-1 nova_compute[183751]: 2026-01-27 21:28:07.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:28:07 compute-1 nova_compute[183751]: 2026-01-27 21:28:07.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:28:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:28:11.179 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:28:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:28:11.179 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:28:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:28:11.179 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:28:15 compute-1 podman[205966]: 2026-01-27 21:28:15.815524645 +0000 UTC m=+0.123907047 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS 
Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 27 21:28:18 compute-1 podman[205993]: 2026-01-27 21:28:18.729555187 +0000 UTC m=+0.047752323 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:28:18 compute-1 podman[205994]: 2026-01-27 21:28:18.729553087 +0000 UTC m=+0.045984539 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Jan 27 21:28:19 compute-1 openstack_network_exporter[195945]: ERROR   21:28:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:28:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:28:19 compute-1 openstack_network_exporter[195945]: ERROR   21:28:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:28:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:28:28 compute-1 podman[206034]: 2026-01-27 21:28:28.737676815 +0000 UTC m=+0.054489759 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:28:34 compute-1 sshd-session[206058]: Invalid user ubuntu from 80.94.92.186 port 49248
Jan 27 21:28:34 compute-1 sshd-session[206058]: Connection closed by invalid user ubuntu 80.94.92.186 port 49248 [preauth]
Jan 27 21:28:35 compute-1 podman[193064]: time="2026-01-27T21:28:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:28:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:28:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:28:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:28:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2159 "" "Go-http-client/1.1"
Jan 27 21:28:46 compute-1 podman[206060]: 2026-01-27 21:28:46.826852897 +0000 UTC m=+0.145318262 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 21:28:49 compute-1 openstack_network_exporter[195945]: ERROR   21:28:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:28:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:28:49 compute-1 openstack_network_exporter[195945]: ERROR   21:28:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:28:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:28:49 compute-1 podman[206085]: 2026-01-27 21:28:49.759714932 +0000 UTC m=+0.069321724 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=openstack_network_exporter)
Jan 27 21:28:49 compute-1 podman[206086]: 2026-01-27 21:28:49.785559641 +0000 UTC m=+0.086802396 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 27 21:28:57 compute-1 nova_compute[183751]: 2026-01-27 21:28:57.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:28:59 compute-1 nova_compute[183751]: 2026-01-27 21:28:59.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:28:59 compute-1 podman[206125]: 2026-01-27 21:28:59.799072227 +0000 UTC m=+0.107936208 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.670 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.670 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.671 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.671 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.818 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.819 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.851 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.852 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6177MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.852 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:29:01 compute-1 nova_compute[183751]: 2026-01-27 21:29:01.852 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:29:02 compute-1 nova_compute[183751]: 2026-01-27 21:29:02.966 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:29:02 compute-1 nova_compute[183751]: 2026-01-27 21:29:02.966 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:29:01 up  1:31,  0 user,  load average: 0.03, 0.11, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:29:02 compute-1 nova_compute[183751]: 2026-01-27 21:29:02.989 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:29:03 compute-1 nova_compute[183751]: 2026-01-27 21:29:03.500 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:29:04 compute-1 nova_compute[183751]: 2026-01-27 21:29:04.012 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:29:04 compute-1 nova_compute[183751]: 2026-01-27 21:29:04.013 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.160s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:29:05 compute-1 nova_compute[183751]: 2026-01-27 21:29:05.013 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:29:05 compute-1 nova_compute[183751]: 2026-01-27 21:29:05.549 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:29:05 compute-1 nova_compute[183751]: 2026-01-27 21:29:05.549 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:29:05 compute-1 nova_compute[183751]: 2026-01-27 21:29:05.549 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:29:05 compute-1 podman[193064]: time="2026-01-27T21:29:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:29:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:29:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:29:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:29:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2159 "" "Go-http-client/1.1"
Jan 27 21:29:05 compute-1 nova_compute[183751]: 2026-01-27 21:29:05.679 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:29:07 compute-1 nova_compute[183751]: 2026-01-27 21:29:07.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:29:07 compute-1 nova_compute[183751]: 2026-01-27 21:29:07.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:29:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:29:11.180 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:29:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:29:11.181 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:29:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:29:11.181 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:29:17 compute-1 podman[206153]: 2026-01-27 21:29:17.870058802 +0000 UTC m=+0.176561383 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 27 21:29:19 compute-1 openstack_network_exporter[195945]: ERROR   21:29:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:29:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:29:19 compute-1 openstack_network_exporter[195945]: ERROR   21:29:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:29:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:29:20 compute-1 podman[206180]: 2026-01-27 21:29:20.768681912 +0000 UTC m=+0.067500219 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 27 21:29:20 compute-1 podman[206179]: 2026-01-27 21:29:20.773421549 +0000 UTC m=+0.076995833 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 27 21:29:30 compute-1 podman[206218]: 2026-01-27 21:29:30.774449897 +0000 UTC m=+0.075027924 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:29:35 compute-1 podman[193064]: time="2026-01-27T21:29:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:29:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:29:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:29:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:29:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2154 "" "Go-http-client/1.1"
Jan 27 21:29:48 compute-1 podman[206242]: 2026-01-27 21:29:48.785030069 +0000 UTC m=+0.092290961 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Jan 27 21:29:49 compute-1 openstack_network_exporter[195945]: ERROR   21:29:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:29:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:29:49 compute-1 openstack_network_exporter[195945]: ERROR   21:29:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:29:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:29:51 compute-1 podman[206269]: 2026-01-27 21:29:51.755031744 +0000 UTC m=+0.064473834 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350)
Jan 27 21:29:51 compute-1 podman[206270]: 2026-01-27 21:29:51.772948526 +0000 UTC m=+0.066269388 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 27 21:29:59 compute-1 nova_compute[183751]: 2026-01-27 21:29:59.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:30:01 compute-1 nova_compute[183751]: 2026-01-27 21:30:01.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:30:01 compute-1 podman[206309]: 2026-01-27 21:30:01.741071622 +0000 UTC m=+0.054995959 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.666 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.791 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.792 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.812 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.812 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6181MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.813 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:30:02 compute-1 nova_compute[183751]: 2026-01-27 21:30:02.813 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:30:03 compute-1 nova_compute[183751]: 2026-01-27 21:30:03.877 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:30:03 compute-1 nova_compute[183751]: 2026-01-27 21:30:03.877 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:30:02 up  1:32,  0 user,  load average: 0.01, 0.08, 0.22\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:30:03 compute-1 nova_compute[183751]: 2026-01-27 21:30:03.902 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:30:04 compute-1 nova_compute[183751]: 2026-01-27 21:30:04.410 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:30:04 compute-1 nova_compute[183751]: 2026-01-27 21:30:04.920 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:30:04 compute-1 nova_compute[183751]: 2026-01-27 21:30:04.920 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:30:05 compute-1 podman[193064]: time="2026-01-27T21:30:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:30:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:30:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:30:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:30:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2157 "" "Go-http-client/1.1"
Jan 27 21:30:05 compute-1 nova_compute[183751]: 2026-01-27 21:30:05.921 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:30:05 compute-1 nova_compute[183751]: 2026-01-27 21:30:05.921 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:30:07 compute-1 nova_compute[183751]: 2026-01-27 21:30:07.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:30:07 compute-1 nova_compute[183751]: 2026-01-27 21:30:07.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:30:09 compute-1 nova_compute[183751]: 2026-01-27 21:30:09.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:30:09 compute-1 nova_compute[183751]: 2026-01-27 21:30:09.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:30:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:30:11.182 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:30:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:30:11.182 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:30:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:30:11.182 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:30:19 compute-1 openstack_network_exporter[195945]: ERROR   21:30:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:30:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:30:19 compute-1 openstack_network_exporter[195945]: ERROR   21:30:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:30:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:30:19 compute-1 podman[206336]: 2026-01-27 21:30:19.791162001 +0000 UTC m=+0.097295785 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 27 21:30:22 compute-1 podman[206362]: 2026-01-27 21:30:22.781016605 +0000 UTC m=+0.085157935 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6)
Jan 27 21:30:22 compute-1 podman[206363]: 2026-01-27 21:30:22.804224888 +0000 UTC m=+0.099519879 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 27 21:30:32 compute-1 podman[206402]: 2026-01-27 21:30:32.726622305 +0000 UTC m=+0.041330452 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:30:35 compute-1 podman[193064]: time="2026-01-27T21:30:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:30:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:30:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:30:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:30:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2160 "" "Go-http-client/1.1"
Jan 27 21:30:49 compute-1 openstack_network_exporter[195945]: ERROR   21:30:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:30:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:30:49 compute-1 openstack_network_exporter[195945]: ERROR   21:30:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:30:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:30:50 compute-1 podman[206425]: 2026-01-27 21:30:50.806878687 +0000 UTC m=+0.105825115 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:30:53 compute-1 podman[206452]: 2026-01-27 21:30:53.761745067 +0000 UTC m=+0.069932649 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, name=ubi9-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=)
Jan 27 21:30:53 compute-1 podman[206453]: 2026-01-27 21:30:53.780678615 +0000 UTC m=+0.080428589 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent)
Jan 27 21:31:01 compute-1 nova_compute[183751]: 2026-01-27 21:31:01.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:01 compute-1 nova_compute[183751]: 2026-01-27 21:31:01.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:01 compute-1 nova_compute[183751]: 2026-01-27 21:31:01.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 21:31:02 compute-1 nova_compute[183751]: 2026-01-27 21:31:02.655 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:03 compute-1 nova_compute[183751]: 2026-01-27 21:31:03.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:03 compute-1 podman[206492]: 2026-01-27 21:31:03.768954237 +0000 UTC m=+0.073743363 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:31:04 compute-1 nova_compute[183751]: 2026-01-27 21:31:04.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:04 compute-1 nova_compute[183751]: 2026-01-27 21:31:04.658 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:05 compute-1 nova_compute[183751]: 2026-01-27 21:31:05.173 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:31:05 compute-1 nova_compute[183751]: 2026-01-27 21:31:05.174 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:31:05 compute-1 nova_compute[183751]: 2026-01-27 21:31:05.174 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:31:05 compute-1 nova_compute[183751]: 2026-01-27 21:31:05.174 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:31:05 compute-1 nova_compute[183751]: 2026-01-27 21:31:05.324 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:31:05 compute-1 nova_compute[183751]: 2026-01-27 21:31:05.326 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:31:05 compute-1 nova_compute[183751]: 2026-01-27 21:31:05.348 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:31:05 compute-1 nova_compute[183751]: 2026-01-27 21:31:05.349 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6181MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:31:05 compute-1 nova_compute[183751]: 2026-01-27 21:31:05.349 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:31:05 compute-1 nova_compute[183751]: 2026-01-27 21:31:05.349 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:31:05 compute-1 podman[193064]: time="2026-01-27T21:31:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:31:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:31:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:31:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:31:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2150 "" "Go-http-client/1.1"
Jan 27 21:31:06 compute-1 nova_compute[183751]: 2026-01-27 21:31:06.406 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:31:06 compute-1 nova_compute[183751]: 2026-01-27 21:31:06.406 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:31:05 up  1:33,  0 user,  load average: 0.00, 0.06, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:31:06 compute-1 nova_compute[183751]: 2026-01-27 21:31:06.429 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:31:06 compute-1 nova_compute[183751]: 2026-01-27 21:31:06.940 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:31:07 compute-1 nova_compute[183751]: 2026-01-27 21:31:07.454 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:31:07 compute-1 nova_compute[183751]: 2026-01-27 21:31:07.455 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:31:08 compute-1 nova_compute[183751]: 2026-01-27 21:31:08.946 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:08 compute-1 nova_compute[183751]: 2026-01-27 21:31:08.947 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:09 compute-1 nova_compute[183751]: 2026-01-27 21:31:09.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:10 compute-1 nova_compute[183751]: 2026-01-27 21:31:10.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:10 compute-1 nova_compute[183751]: 2026-01-27 21:31:10.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:31:10 compute-1 nova_compute[183751]: 2026-01-27 21:31:10.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:10 compute-1 nova_compute[183751]: 2026-01-27 21:31:10.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 21:31:10 compute-1 nova_compute[183751]: 2026-01-27 21:31:10.680 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 21:31:10 compute-1 nova_compute[183751]: 2026-01-27 21:31:10.680 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:31:11.184 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:31:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:31:11.184 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:31:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:31:11.184 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:31:19 compute-1 openstack_network_exporter[195945]: ERROR   21:31:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:31:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:31:19 compute-1 openstack_network_exporter[195945]: ERROR   21:31:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:31:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:31:21 compute-1 podman[206519]: 2026-01-27 21:31:21.811839126 +0000 UTC m=+0.118282783 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 27 21:31:24 compute-1 podman[206545]: 2026-01-27 21:31:24.739858793 +0000 UTC m=+0.056335993 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Jan 27 21:31:24 compute-1 podman[206546]: 2026-01-27 21:31:24.740351246 +0000 UTC m=+0.051737200 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:31:34 compute-1 podman[206586]: 2026-01-27 21:31:34.766127636 +0000 UTC m=+0.082593902 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:31:35 compute-1 podman[193064]: time="2026-01-27T21:31:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:31:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:31:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:31:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:31:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2162 "" "Go-http-client/1.1"
Jan 27 21:31:37 compute-1 nova_compute[183751]: 2026-01-27 21:31:37.832 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:31:49 compute-1 openstack_network_exporter[195945]: ERROR   21:31:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:31:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:31:49 compute-1 openstack_network_exporter[195945]: ERROR   21:31:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:31:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:31:52 compute-1 podman[206610]: 2026-01-27 21:31:52.767700425 +0000 UTC m=+0.074059331 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 27 21:31:55 compute-1 podman[206640]: 2026-01-27 21:31:55.782331042 +0000 UTC m=+0.078829899 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 21:31:55 compute-1 podman[206639]: 2026-01-27 21:31:55.787985201 +0000 UTC m=+0.091257345 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:32:03 compute-1 nova_compute[183751]: 2026-01-27 21:32:03.730 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:32:03 compute-1 nova_compute[183751]: 2026-01-27 21:32:03.730 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:32:03 compute-1 nova_compute[183751]: 2026-01-27 21:32:03.731 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.704 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.704 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.705 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.705 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.860 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.862 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.894 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.895 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6170MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.895 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:32:04 compute-1 nova_compute[183751]: 2026-01-27 21:32:04.895 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:32:05 compute-1 podman[206681]: 2026-01-27 21:32:05.773249662 +0000 UTC m=+0.074458041 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:32:05 compute-1 podman[193064]: time="2026-01-27T21:32:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:32:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:32:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:32:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:32:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2156 "" "Go-http-client/1.1"
Jan 27 21:32:06 compute-1 nova_compute[183751]: 2026-01-27 21:32:06.229 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:32:06 compute-1 nova_compute[183751]: 2026-01-27 21:32:06.229 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:32:04 up  1:34,  0 user,  load average: 0.00, 0.05, 0.19\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:32:06 compute-1 nova_compute[183751]: 2026-01-27 21:32:06.350 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 21:32:06 compute-1 nova_compute[183751]: 2026-01-27 21:32:06.452 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 21:32:06 compute-1 nova_compute[183751]: 2026-01-27 21:32:06.453 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:32:06 compute-1 nova_compute[183751]: 2026-01-27 21:32:06.468 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 21:32:06 compute-1 nova_compute[183751]: 2026-01-27 21:32:06.491 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 21:32:06 compute-1 nova_compute[183751]: 2026-01-27 21:32:06.512 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:32:07 compute-1 nova_compute[183751]: 2026-01-27 21:32:07.029 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:32:07 compute-1 nova_compute[183751]: 2026-01-27 21:32:07.631 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:32:07 compute-1 nova_compute[183751]: 2026-01-27 21:32:07.631 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.736s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:32:10 compute-1 nova_compute[183751]: 2026-01-27 21:32:10.632 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:32:10 compute-1 nova_compute[183751]: 2026-01-27 21:32:10.633 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:32:10 compute-1 nova_compute[183751]: 2026-01-27 21:32:10.633 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:32:10 compute-1 nova_compute[183751]: 2026-01-27 21:32:10.633 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:32:10 compute-1 nova_compute[183751]: 2026-01-27 21:32:10.634 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:32:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:32:11.185 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:32:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:32:11.186 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:32:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:32:11.186 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:32:19 compute-1 openstack_network_exporter[195945]: ERROR   21:32:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:32:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:32:19 compute-1 openstack_network_exporter[195945]: ERROR   21:32:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:32:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:32:23 compute-1 podman[206706]: 2026-01-27 21:32:23.751013462 +0000 UTC m=+0.067440998 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 21:32:26 compute-1 podman[206733]: 2026-01-27 21:32:26.734714734 +0000 UTC m=+0.052067348 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:32:26 compute-1 podman[206734]: 2026-01-27 21:32:26.734935849 +0000 UTC m=+0.048972541 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 27 21:32:35 compute-1 podman[193064]: time="2026-01-27T21:32:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:32:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:32:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:32:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:32:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2161 "" "Go-http-client/1.1"
Jan 27 21:32:36 compute-1 podman[206772]: 2026-01-27 21:32:36.742537061 +0000 UTC m=+0.050409717 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:32:49 compute-1 openstack_network_exporter[195945]: ERROR   21:32:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:32:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:32:49 compute-1 openstack_network_exporter[195945]: ERROR   21:32:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:32:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:32:54 compute-1 podman[206796]: 2026-01-27 21:32:54.841508476 +0000 UTC m=+0.150271525 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260126, 
org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 27 21:32:57 compute-1 podman[206823]: 2026-01-27 21:32:57.753023314 +0000 UTC m=+0.054555122 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 27 21:32:57 compute-1 podman[206822]: 2026-01-27 21:32:57.758496669 +0000 UTC m=+0.063930913 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Jan 27 21:33:04 compute-1 nova_compute[183751]: 2026-01-27 21:33:04.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:33:04 compute-1 nova_compute[183751]: 2026-01-27 21:33:04.683 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:33:05 compute-1 podman[193064]: time="2026-01-27T21:33:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:33:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:33:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:33:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:33:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2159 "" "Go-http-client/1.1"
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.668 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:33:05 compute-1 sshd-session[206862]: Invalid user ubuntu from 80.94.92.186 port 52278
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.823 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.824 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.859 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.859 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6168MB free_disk=73.18217849731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.860 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:33:05 compute-1 nova_compute[183751]: 2026-01-27 21:33:05.860 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:33:06 compute-1 sshd-session[206862]: Connection closed by invalid user ubuntu 80.94.92.186 port 52278 [preauth]
Jan 27 21:33:06 compute-1 nova_compute[183751]: 2026-01-27 21:33:06.945 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:33:06 compute-1 nova_compute[183751]: 2026-01-27 21:33:06.945 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:33:05 up  1:35,  0 user,  load average: 0.00, 0.04, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:33:06 compute-1 nova_compute[183751]: 2026-01-27 21:33:06.988 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:33:07 compute-1 nova_compute[183751]: 2026-01-27 21:33:07.512 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:33:07 compute-1 podman[206866]: 2026-01-27 21:33:07.744720005 +0000 UTC m=+0.057635328 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:33:08 compute-1 nova_compute[183751]: 2026-01-27 21:33:08.026 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:33:08 compute-1 nova_compute[183751]: 2026-01-27 21:33:08.026 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.166s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:33:11 compute-1 nova_compute[183751]: 2026-01-27 21:33:11.023 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:33:11 compute-1 nova_compute[183751]: 2026-01-27 21:33:11.023 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:33:11 compute-1 nova_compute[183751]: 2026-01-27 21:33:11.023 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:33:11 compute-1 nova_compute[183751]: 2026-01-27 21:33:11.023 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:33:11 compute-1 nova_compute[183751]: 2026-01-27 21:33:11.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:33:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:33:11.187 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:33:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:33:11.188 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:33:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:33:11.188 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:33:19 compute-1 openstack_network_exporter[195945]: ERROR   21:33:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:33:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:33:19 compute-1 openstack_network_exporter[195945]: ERROR   21:33:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:33:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:33:25 compute-1 podman[206893]: 2026-01-27 21:33:25.786711481 +0000 UTC m=+0.090037574 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 27 21:33:28 compute-1 podman[206921]: 2026-01-27 21:33:28.740715032 +0000 UTC m=+0.047450227 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 27 21:33:28 compute-1 podman[206920]: 2026-01-27 21:33:28.761686198 +0000 UTC m=+0.077283701 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, 
distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:33:35 compute-1 podman[193064]: time="2026-01-27T21:33:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:33:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:33:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:33:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:33:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Jan 27 21:33:38 compute-1 podman[206959]: 2026-01-27 21:33:38.78275995 +0000 UTC m=+0.083955215 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:33:49 compute-1 openstack_network_exporter[195945]: ERROR   21:33:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:33:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:33:49 compute-1 openstack_network_exporter[195945]: ERROR   21:33:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:33:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:33:56 compute-1 podman[206983]: 2026-01-27 21:33:56.790164696 +0000 UTC m=+0.096458791 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 27 21:33:59 compute-1 podman[207010]: 2026-01-27 21:33:59.745971452 +0000 UTC m=+0.058423607 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 
'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:33:59 compute-1 podman[207011]: 2026-01-27 21:33:59.765867591 +0000 UTC m=+0.063887511 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:34:05 compute-1 podman[193064]: time="2026-01-27T21:34:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:34:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:34:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:34:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:34:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2161 "" "Go-http-client/1.1"
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.875 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.877 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.910 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.911 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6182MB free_disk=73.1819953918457GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.912 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:34:05 compute-1 nova_compute[183751]: 2026-01-27 21:34:05.912 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:34:06 compute-1 nova_compute[183751]: 2026-01-27 21:34:06.958 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:34:06 compute-1 nova_compute[183751]: 2026-01-27 21:34:06.959 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:34:05 up  1:36,  0 user,  load average: 0.00, 0.03, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:34:06 compute-1 nova_compute[183751]: 2026-01-27 21:34:06.978 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:34:07 compute-1 nova_compute[183751]: 2026-01-27 21:34:07.486 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:34:07 compute-1 nova_compute[183751]: 2026-01-27 21:34:07.996 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:34:07 compute-1 nova_compute[183751]: 2026-01-27 21:34:07.997 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.085s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:34:09 compute-1 podman[207051]: 2026-01-27 21:34:09.747442823 +0000 UTC m=+0.059670177 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:34:10 compute-1 nova_compute[183751]: 2026-01-27 21:34:10.993 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:34:10 compute-1 nova_compute[183751]: 2026-01-27 21:34:10.994 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:34:11 compute-1 nova_compute[183751]: 2026-01-27 21:34:11.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:34:11 compute-1 nova_compute[183751]: 2026-01-27 21:34:11.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:34:11 compute-1 nova_compute[183751]: 2026-01-27 21:34:11.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:34:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:34:11.189 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:34:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:34:11.189 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:34:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:34:11.189 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:34:19 compute-1 openstack_network_exporter[195945]: ERROR   21:34:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:34:19 compute-1 openstack_network_exporter[195945]: ERROR   21:34:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:34:27 compute-1 podman[207078]: 2026-01-27 21:34:27.823381002 +0000 UTC m=+0.129762489 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 27 21:34:30 compute-1 podman[207105]: 2026-01-27 21:34:30.751109949 +0000 UTC m=+0.058260282 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Jan 27 21:34:30 compute-1 podman[207106]: 2026-01-27 21:34:30.754469572 +0000 UTC m=+0.052914872 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 27 21:34:35 compute-1 podman[193064]: time="2026-01-27T21:34:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:34:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:34:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:34:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:34:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2159 "" "Go-http-client/1.1"
Jan 27 21:34:40 compute-1 podman[207146]: 2026-01-27 21:34:40.741032837 +0000 UTC m=+0.052466780 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:34:49 compute-1 openstack_network_exporter[195945]: ERROR   21:34:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:34:49 compute-1 openstack_network_exporter[195945]: ERROR   21:34:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:34:58 compute-1 podman[207170]: 2026-01-27 21:34:58.833700858 +0000 UTC m=+0.137165343 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 27 21:35:01 compute-1 podman[207197]: 2026-01-27 21:35:01.757954308 +0000 UTC m=+0.075622330 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Jan 27 21:35:01 compute-1 podman[207198]: 2026-01-27 21:35:01.778599175 +0000 UTC m=+0.089462909 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 27 21:35:05 compute-1 nova_compute[183751]: 2026-01-27 21:35:05.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:35:05 compute-1 nova_compute[183751]: 2026-01-27 21:35:05.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:35:05 compute-1 podman[193064]: time="2026-01-27T21:35:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:35:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:35:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:35:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:35:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2156 "" "Go-http-client/1.1"
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.673 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.674 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.674 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.675 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.872 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.873 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.906 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.906 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6175MB free_disk=73.1819953918457GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.907 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:35:06 compute-1 nova_compute[183751]: 2026-01-27 21:35:06.907 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:35:07 compute-1 nova_compute[183751]: 2026-01-27 21:35:07.956 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:35:07 compute-1 nova_compute[183751]: 2026-01-27 21:35:07.956 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:35:06 up  1:37,  0 user,  load average: 0.07, 0.04, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:35:07 compute-1 nova_compute[183751]: 2026-01-27 21:35:07.974 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:35:08 compute-1 nova_compute[183751]: 2026-01-27 21:35:08.480 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:35:08 compute-1 nova_compute[183751]: 2026-01-27 21:35:08.992 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:35:08 compute-1 nova_compute[183751]: 2026-01-27 21:35:08.993 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:35:10 compute-1 nova_compute[183751]: 2026-01-27 21:35:10.989 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:35:10 compute-1 nova_compute[183751]: 2026-01-27 21:35:10.990 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:35:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:35:11.190 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:35:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:35:11.190 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:35:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:35:11.191 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:35:11 compute-1 nova_compute[183751]: 2026-01-27 21:35:11.500 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:35:11 compute-1 podman[207238]: 2026-01-27 21:35:11.744981943 +0000 UTC m=+0.060288102 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:35:12 compute-1 nova_compute[183751]: 2026-01-27 21:35:12.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:35:12 compute-1 nova_compute[183751]: 2026-01-27 21:35:12.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:35:12 compute-1 nova_compute[183751]: 2026-01-27 21:35:12.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:35:19 compute-1 openstack_network_exporter[195945]: ERROR   21:35:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:35:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:35:19 compute-1 openstack_network_exporter[195945]: ERROR   21:35:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:35:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:35:29 compute-1 podman[207262]: 2026-01-27 21:35:29.809946944 +0000 UTC m=+0.105780501 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.4, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:35:32 compute-1 podman[207289]: 2026-01-27 21:35:32.797377827 +0000 UTC m=+0.108094628 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 21:35:32 compute-1 podman[207288]: 2026-01-27 21:35:32.797813607 +0000 UTC m=+0.111286556 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter)
Jan 27 21:35:35 compute-1 podman[193064]: time="2026-01-27T21:35:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:35:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:35:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:35:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:35:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2160 "" "Go-http-client/1.1"
Jan 27 21:35:42 compute-1 podman[207328]: 2026-01-27 21:35:42.754707921 +0000 UTC m=+0.065401548 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:35:49 compute-1 openstack_network_exporter[195945]: ERROR   21:35:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:35:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:35:49 compute-1 openstack_network_exporter[195945]: ERROR   21:35:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:35:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:36:00 compute-1 podman[207352]: 2026-01-27 21:36:00.786103788 +0000 UTC m=+0.097494517 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:36:03 compute-1 podman[207379]: 2026-01-27 21:36:03.771685557 +0000 UTC m=+0.080926989 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 21:36:03 compute-1 podman[207378]: 2026-01-27 21:36:03.789413543 +0000 UTC m=+0.097475566 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc.)
Jan 27 21:36:05 compute-1 nova_compute[183751]: 2026-01-27 21:36:05.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:05 compute-1 podman[193064]: time="2026-01-27T21:36:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:36:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:36:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:36:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:36:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2163 "" "Go-http-client/1.1"
Jan 27 21:36:06 compute-1 nova_compute[183751]: 2026-01-27 21:36:06.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:07 compute-1 nova_compute[183751]: 2026-01-27 21:36:07.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.793 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.794 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.795 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.795 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.948 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.949 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.982 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.982 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6176MB free_disk=73.1819953918457GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.982 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:36:08 compute-1 nova_compute[183751]: 2026-01-27 21:36:08.983 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:36:10 compute-1 nova_compute[183751]: 2026-01-27 21:36:10.031 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:36:10 compute-1 nova_compute[183751]: 2026-01-27 21:36:10.032 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:36:08 up  1:38,  0 user,  load average: 0.02, 0.03, 0.14\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:36:10 compute-1 nova_compute[183751]: 2026-01-27 21:36:10.057 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:36:10 compute-1 nova_compute[183751]: 2026-01-27 21:36:10.563 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:36:11 compute-1 nova_compute[183751]: 2026-01-27 21:36:11.074 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:36:11 compute-1 nova_compute[183751]: 2026-01-27 21:36:11.075 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:36:11 compute-1 nova_compute[183751]: 2026-01-27 21:36:11.075 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:11 compute-1 nova_compute[183751]: 2026-01-27 21:36:11.075 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 21:36:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:36:11.192 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:36:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:36:11.192 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:36:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:36:11.192 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:36:12 compute-1 nova_compute[183751]: 2026-01-27 21:36:12.658 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:12 compute-1 nova_compute[183751]: 2026-01-27 21:36:12.659 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:12 compute-1 nova_compute[183751]: 2026-01-27 21:36:12.659 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:12 compute-1 nova_compute[183751]: 2026-01-27 21:36:12.659 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:12 compute-1 nova_compute[183751]: 2026-01-27 21:36:12.660 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 21:36:13 compute-1 nova_compute[183751]: 2026-01-27 21:36:13.169 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 21:36:13 compute-1 nova_compute[183751]: 2026-01-27 21:36:13.658 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:13 compute-1 nova_compute[183751]: 2026-01-27 21:36:13.659 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:36:13 compute-1 podman[207418]: 2026-01-27 21:36:13.737702677 +0000 UTC m=+0.050613685 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:36:16 compute-1 nova_compute[183751]: 2026-01-27 21:36:16.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:36:19 compute-1 openstack_network_exporter[195945]: ERROR   21:36:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:36:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:36:19 compute-1 openstack_network_exporter[195945]: ERROR   21:36:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:36:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:36:31 compute-1 podman[207442]: 2026-01-27 21:36:31.806438191 +0000 UTC m=+0.102073290 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Jan 27 21:36:34 compute-1 podman[207469]: 2026-01-27 21:36:34.735199142 +0000 UTC m=+0.046299199 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 27 21:36:34 compute-1 podman[207468]: 2026-01-27 21:36:34.736821922 +0000 UTC m=+0.053478235 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Jan 27 21:36:35 compute-1 podman[193064]: time="2026-01-27T21:36:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:36:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:36:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:36:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:36:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 21:36:44 compute-1 podman[207503]: 2026-01-27 21:36:44.773271092 +0000 UTC m=+0.077512245 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:36:49 compute-1 openstack_network_exporter[195945]: ERROR   21:36:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:36:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:36:49 compute-1 openstack_network_exporter[195945]: ERROR   21:36:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:36:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:37:02 compute-1 podman[207527]: 2026-01-27 21:37:02.831583656 +0000 UTC m=+0.143073217 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 27 21:37:05 compute-1 podman[193064]: time="2026-01-27T21:37:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:37:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:37:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:37:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:37:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2162 "" "Go-http-client/1.1"
Jan 27 21:37:05 compute-1 podman[207555]: 2026-01-27 21:37:05.769913893 +0000 UTC m=+0.068986167 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 27 21:37:05 compute-1 podman[207554]: 2026-01-27 21:37:05.78485258 +0000 UTC m=+0.084287413 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Jan 27 21:37:06 compute-1 nova_compute[183751]: 2026-01-27 21:37:06.653 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:37:07 compute-1 nova_compute[183751]: 2026-01-27 21:37:07.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:37:08 compute-1 nova_compute[183751]: 2026-01-27 21:37:08.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.666 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.822 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.823 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.844 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.845 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6174MB free_disk=73.1819953918457GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.845 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:37:09 compute-1 nova_compute[183751]: 2026-01-27 21:37:09.845 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:37:11 compute-1 nova_compute[183751]: 2026-01-27 21:37:11.093 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:37:11 compute-1 nova_compute[183751]: 2026-01-27 21:37:11.094 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:37:09 up  1:39,  0 user,  load average: 0.01, 0.02, 0.13\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:37:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:37:11.193 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:37:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:37:11.193 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:37:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:37:11.193 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:37:11 compute-1 nova_compute[183751]: 2026-01-27 21:37:11.214 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 21:37:11 compute-1 nova_compute[183751]: 2026-01-27 21:37:11.307 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 21:37:11 compute-1 nova_compute[183751]: 2026-01-27 21:37:11.307 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:37:11 compute-1 nova_compute[183751]: 2026-01-27 21:37:11.326 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 21:37:11 compute-1 nova_compute[183751]: 2026-01-27 21:37:11.351 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 21:37:11 compute-1 nova_compute[183751]: 2026-01-27 21:37:11.372 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:37:11 compute-1 nova_compute[183751]: 2026-01-27 21:37:11.888 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:37:12 compute-1 nova_compute[183751]: 2026-01-27 21:37:12.399 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:37:12 compute-1 nova_compute[183751]: 2026-01-27 21:37:12.400 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.554s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:37:14 compute-1 nova_compute[183751]: 2026-01-27 21:37:14.396 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:37:14 compute-1 nova_compute[183751]: 2026-01-27 21:37:14.397 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:37:15 compute-1 nova_compute[183751]: 2026-01-27 21:37:15.012 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:37:15 compute-1 nova_compute[183751]: 2026-01-27 21:37:15.013 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:37:15 compute-1 nova_compute[183751]: 2026-01-27 21:37:15.014 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:37:15 compute-1 nova_compute[183751]: 2026-01-27 21:37:15.014 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:37:15 compute-1 podman[207594]: 2026-01-27 21:37:15.775955203 +0000 UTC m=+0.079178911 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:37:19 compute-1 openstack_network_exporter[195945]: ERROR   21:37:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:37:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:37:19 compute-1 openstack_network_exporter[195945]: ERROR   21:37:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:37:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:37:33 compute-1 podman[207619]: 2026-01-27 21:37:33.761683541 +0000 UTC m=+0.077067758 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 21:37:35 compute-1 podman[193064]: time="2026-01-27T21:37:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:37:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:37:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:37:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:37:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 21:37:36 compute-1 podman[207645]: 2026-01-27 21:37:36.747360416 +0000 UTC m=+0.057974548 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 27 21:37:36 compute-1 podman[207646]: 2026-01-27 21:37:36.77675291 +0000 UTC m=+0.073666765 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:37:46 compute-1 podman[207685]: 2026-01-27 21:37:46.745552186 +0000 UTC m=+0.052642817 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:37:49 compute-1 openstack_network_exporter[195945]: ERROR   21:37:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:37:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:37:49 compute-1 openstack_network_exporter[195945]: ERROR   21:37:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:37:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:37:53 compute-1 sshd-session[207709]: Invalid user ubuntu from 80.94.92.186 port 55296
Jan 27 21:37:53 compute-1 sshd-session[207709]: Connection closed by invalid user ubuntu 80.94.92.186 port 55296 [preauth]
Jan 27 21:38:04 compute-1 podman[207711]: 2026-01-27 21:38:04.787655703 +0000 UTC m=+0.088227794 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Jan 27 21:38:05 compute-1 podman[193064]: time="2026-01-27T21:38:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:38:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:38:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:38:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:38:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 21:38:06 compute-1 nova_compute[183751]: 2026-01-27 21:38:06.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:38:07 compute-1 podman[207737]: 2026-01-27 21:38:07.748776024 +0000 UTC m=+0.062031089 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, config_id=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7)
Jan 27 21:38:07 compute-1 podman[207738]: 2026-01-27 21:38:07.749449021 +0000 UTC m=+0.057618791 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Jan 27 21:38:09 compute-1 nova_compute[183751]: 2026-01-27 21:38:09.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:38:10 compute-1 nova_compute[183751]: 2026-01-27 21:38:10.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:38:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:38:11.194 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:38:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:38:11.195 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:38:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:38:11.195 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.669 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.816 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.817 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.830 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.831 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6171MB free_disk=73.18199157714844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.831 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:38:11 compute-1 nova_compute[183751]: 2026-01-27 21:38:11.831 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:38:12 compute-1 nova_compute[183751]: 2026-01-27 21:38:12.927 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:38:12 compute-1 nova_compute[183751]: 2026-01-27 21:38:12.928 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:38:11 up  1:40,  0 user,  load average: 0.00, 0.02, 0.12\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:38:12 compute-1 nova_compute[183751]: 2026-01-27 21:38:12.963 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:38:13 compute-1 nova_compute[183751]: 2026-01-27 21:38:13.507 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:38:14 compute-1 nova_compute[183751]: 2026-01-27 21:38:14.052 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:38:14 compute-1 nova_compute[183751]: 2026-01-27 21:38:14.052 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.221s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:38:15 compute-1 nova_compute[183751]: 2026-01-27 21:38:15.048 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:38:15 compute-1 nova_compute[183751]: 2026-01-27 21:38:15.049 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:38:15 compute-1 nova_compute[183751]: 2026-01-27 21:38:15.050 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:38:16 compute-1 nova_compute[183751]: 2026-01-27 21:38:16.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:38:16 compute-1 nova_compute[183751]: 2026-01-27 21:38:16.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:38:17 compute-1 podman[207781]: 2026-01-27 21:38:17.770468893 +0000 UTC m=+0.073237114 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:38:19 compute-1 openstack_network_exporter[195945]: ERROR   21:38:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:38:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:38:19 compute-1 openstack_network_exporter[195945]: ERROR   21:38:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:38:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:38:35 compute-1 podman[193064]: time="2026-01-27T21:38:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:38:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:38:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:38:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:38:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 21:38:35 compute-1 podman[207807]: 2026-01-27 21:38:35.810651955 +0000 UTC m=+0.116487850 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Jan 27 21:38:38 compute-1 podman[207834]: 2026-01-27 21:38:38.770025892 +0000 UTC m=+0.080304329 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 21:38:38 compute-1 podman[207833]: 2026-01-27 21:38:38.770290909 +0000 UTC m=+0.082198296 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.expose-services=)
Jan 27 21:38:48 compute-1 podman[207873]: 2026-01-27 21:38:48.733681321 +0000 UTC m=+0.049539071 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:38:49 compute-1 openstack_network_exporter[195945]: ERROR   21:38:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:38:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:38:49 compute-1 openstack_network_exporter[195945]: ERROR   21:38:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:38:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:39:05 compute-1 podman[193064]: time="2026-01-27T21:39:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:39:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:39:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:39:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:39:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 21:39:06 compute-1 nova_compute[183751]: 2026-01-27 21:39:06.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:39:06 compute-1 podman[207899]: 2026-01-27 21:39:06.787277469 +0000 UTC m=+0.099368017 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 27 21:39:09 compute-1 podman[207926]: 2026-01-27 21:39:09.74392845 +0000 UTC m=+0.054267927 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 27 21:39:09 compute-1 podman[207925]: 2026-01-27 21:39:09.748351379 +0000 UTC m=+0.058654125 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64)
Jan 27 21:39:11 compute-1 nova_compute[183751]: 2026-01-27 21:39:11.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:39:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:39:11.195 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:39:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:39:11.196 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:39:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:39:11.196 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:39:11 compute-1 nova_compute[183751]: 2026-01-27 21:39:11.656 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:39:11 compute-1 nova_compute[183751]: 2026-01-27 21:39:11.656 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.842 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.844 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.863 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.863 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6171MB free_disk=73.18168640136719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.864 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:39:12 compute-1 nova_compute[183751]: 2026-01-27 21:39:12.864 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:39:13 compute-1 nova_compute[183751]: 2026-01-27 21:39:13.925 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:39:13 compute-1 nova_compute[183751]: 2026-01-27 21:39:13.925 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:39:12 up  1:41,  0 user,  load average: 0.00, 0.01, 0.11\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:39:13 compute-1 nova_compute[183751]: 2026-01-27 21:39:13.954 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:39:14 compute-1 nova_compute[183751]: 2026-01-27 21:39:14.462 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:39:14 compute-1 nova_compute[183751]: 2026-01-27 21:39:14.972 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:39:14 compute-1 nova_compute[183751]: 2026-01-27 21:39:14.973 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.109s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:39:15 compute-1 nova_compute[183751]: 2026-01-27 21:39:15.972 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:39:15 compute-1 nova_compute[183751]: 2026-01-27 21:39:15.972 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:39:16 compute-1 nova_compute[183751]: 2026-01-27 21:39:16.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:39:16 compute-1 nova_compute[183751]: 2026-01-27 21:39:16.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:39:19 compute-1 openstack_network_exporter[195945]: ERROR   21:39:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:39:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:39:19 compute-1 openstack_network_exporter[195945]: ERROR   21:39:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:39:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:39:19 compute-1 podman[207966]: 2026-01-27 21:39:19.75595231 +0000 UTC m=+0.059616179 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:39:35 compute-1 podman[193064]: time="2026-01-27T21:39:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:39:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:39:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:39:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:39:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 21:39:37 compute-1 podman[207991]: 2026-01-27 21:39:37.766426378 +0000 UTC m=+0.084735518 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126)
Jan 27 21:39:40 compute-1 podman[208017]: 2026-01-27 21:39:40.788286164 +0000 UTC m=+0.087606329 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Jan 27 21:39:40 compute-1 podman[208018]: 2026-01-27 21:39:40.812830578 +0000 UTC m=+0.105660443 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Jan 27 21:39:49 compute-1 openstack_network_exporter[195945]: ERROR   21:39:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:39:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:39:49 compute-1 openstack_network_exporter[195945]: ERROR   21:39:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:39:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:39:50 compute-1 podman[208056]: 2026-01-27 21:39:50.792046281 +0000 UTC m=+0.100184688 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:40:05 compute-1 podman[193064]: time="2026-01-27T21:40:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:40:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:40:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:40:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:40:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 21:40:08 compute-1 nova_compute[183751]: 2026-01-27 21:40:08.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:40:08 compute-1 podman[208081]: 2026-01-27 21:40:08.824612243 +0000 UTC m=+0.135052227 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:40:11 compute-1 nova_compute[183751]: 2026-01-27 21:40:11.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:40:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:40:11.197 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:40:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:40:11.197 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:40:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:40:11.197 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:40:11 compute-1 podman[208109]: 2026-01-27 21:40:11.7839286 +0000 UTC m=+0.081465658 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41)
Jan 27 21:40:11 compute-1 podman[208110]: 2026-01-27 21:40:11.791274061 +0000 UTC m=+0.085455496 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.679 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.680 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.680 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.680 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.897 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.898 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.923 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.924 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6166MB free_disk=73.18168640136719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.924 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:40:12 compute-1 nova_compute[183751]: 2026-01-27 21:40:12.925 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:40:13 compute-1 nova_compute[183751]: 2026-01-27 21:40:13.978 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:40:13 compute-1 nova_compute[183751]: 2026-01-27 21:40:13.978 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:40:12 up  1:42,  0 user,  load average: 0.08, 0.04, 0.11\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:40:14 compute-1 nova_compute[183751]: 2026-01-27 21:40:14.001 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:40:14 compute-1 nova_compute[183751]: 2026-01-27 21:40:14.511 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:40:15 compute-1 nova_compute[183751]: 2026-01-27 21:40:15.026 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:40:15 compute-1 nova_compute[183751]: 2026-01-27 21:40:15.027 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:40:16 compute-1 nova_compute[183751]: 2026-01-27 21:40:16.023 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:40:16 compute-1 nova_compute[183751]: 2026-01-27 21:40:16.023 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:40:16 compute-1 nova_compute[183751]: 2026-01-27 21:40:16.023 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:40:17 compute-1 nova_compute[183751]: 2026-01-27 21:40:17.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:40:18 compute-1 nova_compute[183751]: 2026-01-27 21:40:18.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:40:18 compute-1 nova_compute[183751]: 2026-01-27 21:40:18.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:40:19 compute-1 openstack_network_exporter[195945]: ERROR   21:40:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:40:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:40:19 compute-1 openstack_network_exporter[195945]: ERROR   21:40:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:40:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:40:21 compute-1 podman[208150]: 2026-01-27 21:40:21.739365308 +0000 UTC m=+0.056883882 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:40:35 compute-1 podman[193064]: time="2026-01-27T21:40:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:40:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:40:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:40:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:40:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 21:40:39 compute-1 podman[208174]: 2026-01-27 21:40:39.780976361 +0000 UTC m=+0.089328201 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Jan 27 21:40:42 compute-1 podman[208200]: 2026-01-27 21:40:42.773053263 +0000 UTC m=+0.077637553 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 27 21:40:42 compute-1 podman[208201]: 2026-01-27 21:40:42.773053143 +0000 UTC m=+0.071884361 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:40:49 compute-1 openstack_network_exporter[195945]: ERROR   21:40:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:40:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:40:49 compute-1 openstack_network_exporter[195945]: ERROR   21:40:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:40:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:40:52 compute-1 podman[208238]: 2026-01-27 21:40:52.771788055 +0000 UTC m=+0.085226150 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:41:05 compute-1 podman[193064]: time="2026-01-27T21:41:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:41:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:41:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:41:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:41:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 21:41:09 compute-1 nova_compute[183751]: 2026-01-27 21:41:09.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:10 compute-1 nova_compute[183751]: 2026-01-27 21:41:10.297 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:10 compute-1 podman[208262]: 2026-01-27 21:41:10.780545783 +0000 UTC m=+0.093760760 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:41:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:41:11.198 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:41:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:41:11.199 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:41:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:41:11.199 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.151 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.673 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.674 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.674 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.675 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.839 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.840 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.862 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.863 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6167MB free_disk=73.18168640136719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.863 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:41:12 compute-1 nova_compute[183751]: 2026-01-27 21:41:12.864 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:41:13 compute-1 podman[208291]: 2026-01-27 21:41:13.262248315 +0000 UTC m=+0.074107166 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, name=ubi9-minimal)
Jan 27 21:41:13 compute-1 podman[208292]: 2026-01-27 21:41:13.277248284 +0000 UTC m=+0.075761326 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 27 21:41:13 compute-1 nova_compute[183751]: 2026-01-27 21:41:13.954 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:41:13 compute-1 nova_compute[183751]: 2026-01-27 21:41:13.955 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:41:12 up  1:43,  0 user,  load average: 0.03, 0.03, 0.10\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:41:13 compute-1 nova_compute[183751]: 2026-01-27 21:41:13.979 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:41:14 compute-1 nova_compute[183751]: 2026-01-27 21:41:14.487 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:41:14 compute-1 nova_compute[183751]: 2026-01-27 21:41:14.997 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:41:14 compute-1 nova_compute[183751]: 2026-01-27 21:41:14.998 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.134s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:41:14 compute-1 nova_compute[183751]: 2026-01-27 21:41:14.998 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:14 compute-1 nova_compute[183751]: 2026-01-27 21:41:14.998 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 21:41:15 compute-1 nova_compute[183751]: 2026-01-27 21:41:15.506 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 21:41:16 compute-1 nova_compute[183751]: 2026-01-27 21:41:16.500 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:16 compute-1 nova_compute[183751]: 2026-01-27 21:41:16.501 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:17 compute-1 nova_compute[183751]: 2026-01-27 21:41:17.015 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:17 compute-1 nova_compute[183751]: 2026-01-27 21:41:17.015 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:17 compute-1 nova_compute[183751]: 2026-01-27 21:41:17.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:19 compute-1 nova_compute[183751]: 2026-01-27 21:41:19.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:19 compute-1 nova_compute[183751]: 2026-01-27 21:41:19.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:41:19 compute-1 openstack_network_exporter[195945]: ERROR   21:41:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:41:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:41:19 compute-1 openstack_network_exporter[195945]: ERROR   21:41:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:41:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:41:22 compute-1 nova_compute[183751]: 2026-01-27 21:41:22.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:22 compute-1 nova_compute[183751]: 2026-01-27 21:41:22.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 21:41:23 compute-1 podman[208331]: 2026-01-27 21:41:23.744738392 +0000 UTC m=+0.054151385 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:41:30 compute-1 nova_compute[183751]: 2026-01-27 21:41:30.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:41:35 compute-1 podman[193064]: time="2026-01-27T21:41:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:41:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:41:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:41:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:41:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2160 "" "Go-http-client/1.1"
Jan 27 21:41:41 compute-1 podman[208357]: 2026-01-27 21:41:41.838517224 +0000 UTC m=+0.135079169 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, 
container_name=ovn_controller, tcib_managed=true)
Jan 27 21:41:43 compute-1 podman[208384]: 2026-01-27 21:41:43.76132001 +0000 UTC m=+0.075383924 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Jan 27 21:41:43 compute-1 podman[208385]: 2026-01-27 21:41:43.79209828 +0000 UTC m=+0.098982577 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 21:41:49 compute-1 openstack_network_exporter[195945]: ERROR   21:41:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:41:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:41:49 compute-1 openstack_network_exporter[195945]: ERROR   21:41:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:41:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:41:54 compute-1 podman[208421]: 2026-01-27 21:41:54.754632094 +0000 UTC m=+0.063467810 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:42:05 compute-1 nova_compute[183751]: 2026-01-27 21:42:05.295 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:42:05 compute-1 podman[193064]: time="2026-01-27T21:42:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:42:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:42:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:42:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:42:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 21:42:09 compute-1 nova_compute[183751]: 2026-01-27 21:42:09.657 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:42:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:42:11.199 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:42:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:42:11.200 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:42:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:42:11.200 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:42:12 compute-1 nova_compute[183751]: 2026-01-27 21:42:12.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:42:12 compute-1 podman[208446]: 2026-01-27 21:42:12.783749524 +0000 UTC m=+0.097452349 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 27 21:42:13 compute-1 nova_compute[183751]: 2026-01-27 21:42:13.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.674 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.674 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.674 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.674 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:42:14 compute-1 podman[208474]: 2026-01-27 21:42:14.793942569 +0000 UTC m=+0.081879504 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Jan 27 21:42:14 compute-1 podman[208473]: 2026-01-27 21:42:14.793609861 +0000 UTC m=+0.087604966 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, config_id=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.887 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.889 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.911 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.912 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6164MB free_disk=73.18168640136719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.912 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:42:14 compute-1 nova_compute[183751]: 2026-01-27 21:42:14.913 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:42:16 compute-1 nova_compute[183751]: 2026-01-27 21:42:16.351 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:42:16 compute-1 nova_compute[183751]: 2026-01-27 21:42:16.352 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:42:14 up  1:44,  0 user,  load average: 0.04, 0.03, 0.09\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:42:16 compute-1 nova_compute[183751]: 2026-01-27 21:42:16.482 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 21:42:16 compute-1 nova_compute[183751]: 2026-01-27 21:42:16.642 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 21:42:16 compute-1 nova_compute[183751]: 2026-01-27 21:42:16.643 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:42:16 compute-1 nova_compute[183751]: 2026-01-27 21:42:16.659 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 21:42:16 compute-1 nova_compute[183751]: 2026-01-27 21:42:16.686 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 21:42:16 compute-1 nova_compute[183751]: 2026-01-27 21:42:16.708 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:42:17 compute-1 nova_compute[183751]: 2026-01-27 21:42:17.215 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:42:17 compute-1 nova_compute[183751]: 2026-01-27 21:42:17.725 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:42:17 compute-1 nova_compute[183751]: 2026-01-27 21:42:17.725 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.812s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:42:18 compute-1 nova_compute[183751]: 2026-01-27 21:42:18.726 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:42:18 compute-1 nova_compute[183751]: 2026-01-27 21:42:18.727 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:42:18 compute-1 nova_compute[183751]: 2026-01-27 21:42:18.727 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:42:19 compute-1 openstack_network_exporter[195945]: ERROR   21:42:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:42:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:42:19 compute-1 openstack_network_exporter[195945]: ERROR   21:42:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:42:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:42:21 compute-1 nova_compute[183751]: 2026-01-27 21:42:21.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:42:21 compute-1 nova_compute[183751]: 2026-01-27 21:42:21.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:42:25 compute-1 podman[208514]: 2026-01-27 21:42:25.763828581 +0000 UTC m=+0.068496123 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:42:29 compute-1 sshd-session[208538]: banner exchange: Connection from 91.224.92.114 port 64835: invalid format
Jan 27 21:42:35 compute-1 podman[193064]: time="2026-01-27T21:42:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:42:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:42:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:42:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:42:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2162 "" "Go-http-client/1.1"
Jan 27 21:42:42 compute-1 sshd-session[208539]: Invalid user ubuntu from 80.94.92.186 port 58320
Jan 27 21:42:42 compute-1 sshd-session[208539]: Connection closed by invalid user ubuntu 80.94.92.186 port 58320 [preauth]
Jan 27 21:42:43 compute-1 podman[208541]: 2026-01-27 21:42:43.831140288 +0000 UTC m=+0.130402104 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:42:45 compute-1 podman[208568]: 2026-01-27 21:42:45.767054087 +0000 UTC m=+0.074955913 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Jan 27 21:42:45 compute-1 podman[208569]: 2026-01-27 21:42:45.77607637 +0000 UTC m=+0.069503198 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 27 21:42:49 compute-1 openstack_network_exporter[195945]: ERROR   21:42:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:42:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:42:49 compute-1 openstack_network_exporter[195945]: ERROR   21:42:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:42:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:42:56 compute-1 podman[208610]: 2026-01-27 21:42:56.741096156 +0000 UTC m=+0.055736318 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:43:05 compute-1 podman[193064]: time="2026-01-27T21:43:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:43:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:43:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:43:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:43:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 21:43:10 compute-1 nova_compute[183751]: 2026-01-27 21:43:10.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:43:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:43:11.201 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:43:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:43:11.201 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:43:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:43:11.202 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:43:12 compute-1 nova_compute[183751]: 2026-01-27 21:43:12.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.666 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:43:14 compute-1 podman[208634]: 2026-01-27 21:43:14.84761128 +0000 UTC m=+0.146492462 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.868 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.869 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.882 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.882 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6174MB free_disk=73.18168640136719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.882 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:43:14 compute-1 nova_compute[183751]: 2026-01-27 21:43:14.882 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:43:15 compute-1 nova_compute[183751]: 2026-01-27 21:43:15.948 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:43:15 compute-1 nova_compute[183751]: 2026-01-27 21:43:15.948 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:43:14 up  1:45,  0 user,  load average: 0.01, 0.02, 0.08\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:43:15 compute-1 nova_compute[183751]: 2026-01-27 21:43:15.973 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:43:16 compute-1 nova_compute[183751]: 2026-01-27 21:43:16.482 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:43:16 compute-1 podman[208663]: 2026-01-27 21:43:16.78031269 +0000 UTC m=+0.083557896 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Jan 27 21:43:16 compute-1 podman[208662]: 2026-01-27 21:43:16.788731218 +0000 UTC m=+0.086427157 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 27 21:43:16 compute-1 nova_compute[183751]: 2026-01-27 21:43:16.993 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:43:16 compute-1 nova_compute[183751]: 2026-01-27 21:43:16.993 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:43:17 compute-1 nova_compute[183751]: 2026-01-27 21:43:17.989 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:43:17 compute-1 nova_compute[183751]: 2026-01-27 21:43:17.990 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:43:19 compute-1 nova_compute[183751]: 2026-01-27 21:43:19.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:43:19 compute-1 nova_compute[183751]: 2026-01-27 21:43:19.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:43:19 compute-1 openstack_network_exporter[195945]: ERROR   21:43:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:43:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:43:19 compute-1 openstack_network_exporter[195945]: ERROR   21:43:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:43:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:43:20 compute-1 nova_compute[183751]: 2026-01-27 21:43:20.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:43:22 compute-1 nova_compute[183751]: 2026-01-27 21:43:22.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:43:22 compute-1 nova_compute[183751]: 2026-01-27 21:43:22.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:43:27 compute-1 podman[208703]: 2026-01-27 21:43:27.787535027 +0000 UTC m=+0.083868153 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:43:35 compute-1 podman[193064]: time="2026-01-27T21:43:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:43:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:43:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:43:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:43:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 21:43:45 compute-1 podman[208727]: 2026-01-27 21:43:45.826602802 +0000 UTC m=+0.138153585 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 27 21:43:47 compute-1 podman[208754]: 2026-01-27 21:43:47.751739355 +0000 UTC m=+0.059402039 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 21:43:47 compute-1 podman[208753]: 2026-01-27 21:43:47.764916991 +0000 UTC m=+0.080803298 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Jan 27 21:43:49 compute-1 openstack_network_exporter[195945]: ERROR   21:43:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:43:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:43:49 compute-1 openstack_network_exporter[195945]: ERROR   21:43:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:43:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:43:58 compute-1 podman[208792]: 2026-01-27 21:43:58.764054679 +0000 UTC m=+0.064339071 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:44:05 compute-1 podman[193064]: time="2026-01-27T21:44:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:44:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:44:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:44:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:44:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2160 "" "Go-http-client/1.1"
Jan 27 21:44:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:44:11.202 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:44:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:44:11.203 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:44:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:44:11.203 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:44:12 compute-1 nova_compute[183751]: 2026-01-27 21:44:12.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:44:14 compute-1 nova_compute[183751]: 2026-01-27 21:44:14.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.146 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.841 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.843 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.869 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.871 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6169MB free_disk=73.18212127685547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.871 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:44:15 compute-1 nova_compute[183751]: 2026-01-27 21:44:15.872 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:44:16 compute-1 podman[208820]: 2026-01-27 21:44:16.812784326 +0000 UTC m=+0.117685660 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 27 21:44:16 compute-1 nova_compute[183751]: 2026-01-27 21:44:16.928 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:44:16 compute-1 nova_compute[183751]: 2026-01-27 21:44:16.929 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:44:15 up  1:46,  0 user,  load average: 0.00, 0.02, 0.08\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:44:16 compute-1 nova_compute[183751]: 2026-01-27 21:44:16.953 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:44:17 compute-1 nova_compute[183751]: 2026-01-27 21:44:17.461 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:44:17 compute-1 nova_compute[183751]: 2026-01-27 21:44:17.977 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:44:17 compute-1 nova_compute[183751]: 2026-01-27 21:44:17.978 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:44:18 compute-1 podman[208849]: 2026-01-27 21:44:18.736630907 +0000 UTC m=+0.049834423 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 21:44:18 compute-1 podman[208848]: 2026-01-27 21:44:18.737498108 +0000 UTC m=+0.052523519 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 27 21:44:19 compute-1 openstack_network_exporter[195945]: ERROR   21:44:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:44:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:44:19 compute-1 openstack_network_exporter[195945]: ERROR   21:44:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:44:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:44:19 compute-1 nova_compute[183751]: 2026-01-27 21:44:19.980 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:44:19 compute-1 nova_compute[183751]: 2026-01-27 21:44:19.980 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:44:21 compute-1 nova_compute[183751]: 2026-01-27 21:44:21.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:44:24 compute-1 nova_compute[183751]: 2026-01-27 21:44:24.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:44:24 compute-1 nova_compute[183751]: 2026-01-27 21:44:24.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:44:29 compute-1 podman[208888]: 2026-01-27 21:44:29.747431382 +0000 UTC m=+0.062651489 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:44:35 compute-1 podman[193064]: time="2026-01-27T21:44:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:44:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:44:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:44:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:44:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 21:44:47 compute-1 podman[208915]: 2026-01-27 21:44:47.793296318 +0000 UTC m=+0.101154800 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:44:49 compute-1 openstack_network_exporter[195945]: ERROR   21:44:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:44:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:44:49 compute-1 openstack_network_exporter[195945]: ERROR   21:44:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:44:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:44:49 compute-1 podman[208941]: 2026-01-27 21:44:49.747210054 +0000 UTC m=+0.060125657 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=)
Jan 27 21:44:49 compute-1 podman[208942]: 2026-01-27 21:44:49.761933208 +0000 UTC m=+0.059976344 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:45:00 compute-1 podman[208981]: 2026-01-27 21:45:00.732769906 +0000 UTC m=+0.051599346 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:45:05 compute-1 podman[193064]: time="2026-01-27T21:45:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:45:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:45:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:45:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:45:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 21:45:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:45:11.204 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:45:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:45:11.204 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:45:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:45:11.204 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:45:12 compute-1 nova_compute[183751]: 2026-01-27 21:45:12.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:45:14 compute-1 nova_compute[183751]: 2026-01-27 21:45:14.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.833 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.834 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.856 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.857 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6167MB free_disk=73.18211364746094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.857 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:45:16 compute-1 nova_compute[183751]: 2026-01-27 21:45:16.858 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:45:17 compute-1 nova_compute[183751]: 2026-01-27 21:45:17.914 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:45:17 compute-1 nova_compute[183751]: 2026-01-27 21:45:17.914 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:45:16 up  1:47,  0 user,  load average: 0.00, 0.01, 0.07\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:45:17 compute-1 nova_compute[183751]: 2026-01-27 21:45:17.931 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:45:18 compute-1 nova_compute[183751]: 2026-01-27 21:45:18.440 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:45:18 compute-1 podman[209007]: 2026-01-27 21:45:18.856090564 +0000 UTC m=+0.167334796 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, 
io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 27 21:45:18 compute-1 nova_compute[183751]: 2026-01-27 21:45:18.951 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:45:18 compute-1 nova_compute[183751]: 2026-01-27 21:45:18.952 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:45:19 compute-1 openstack_network_exporter[195945]: ERROR   21:45:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:45:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:45:19 compute-1 openstack_network_exporter[195945]: ERROR   21:45:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:45:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:45:20 compute-1 podman[209035]: 2026-01-27 21:45:20.74409834 +0000 UTC m=+0.055060852 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 21:45:20 compute-1 podman[209034]: 2026-01-27 21:45:20.76395132 +0000 UTC m=+0.073212110 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:45:20 compute-1 nova_compute[183751]: 2026-01-27 21:45:20.952 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:45:21 compute-1 nova_compute[183751]: 2026-01-27 21:45:21.468 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:45:21 compute-1 nova_compute[183751]: 2026-01-27 21:45:21.468 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:45:22 compute-1 nova_compute[183751]: 2026-01-27 21:45:22.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:45:25 compute-1 nova_compute[183751]: 2026-01-27 21:45:25.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:45:25 compute-1 nova_compute[183751]: 2026-01-27 21:45:25.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:45:31 compute-1 podman[209073]: 2026-01-27 21:45:31.76799441 +0000 UTC m=+0.072232576 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:45:35 compute-1 podman[193064]: time="2026-01-27T21:45:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:45:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:45:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:45:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:45:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 21:45:49 compute-1 openstack_network_exporter[195945]: ERROR   21:45:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:45:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:45:49 compute-1 openstack_network_exporter[195945]: ERROR   21:45:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:45:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:45:49 compute-1 podman[209098]: 2026-01-27 21:45:49.579924873 +0000 UTC m=+0.123676724 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, 
managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 21:45:51 compute-1 podman[209125]: 2026-01-27 21:45:51.776941487 +0000 UTC m=+0.071581444 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126)
Jan 27 21:45:51 compute-1 podman[209124]: 2026-01-27 21:45:51.779386867 +0000 UTC m=+0.069363409 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 27 21:46:02 compute-1 podman[209165]: 2026-01-27 21:46:02.776167238 +0000 UTC m=+0.085999402 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:46:05 compute-1 podman[193064]: time="2026-01-27T21:46:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:46:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:46:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:46:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:46:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 21:46:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:46:11.205 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:46:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:46:11.206 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:46:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:46:11.206 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:46:14 compute-1 nova_compute[183751]: 2026-01-27 21:46:14.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:15 compute-1 nova_compute[183751]: 2026-01-27 21:46:15.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:17 compute-1 nova_compute[183751]: 2026-01-27 21:46:17.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.670 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.671 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.673 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.673 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.842 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.843 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.863 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.864 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6168MB free_disk=73.18211364746094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.864 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:46:18 compute-1 nova_compute[183751]: 2026-01-27 21:46:18.864 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:46:19 compute-1 openstack_network_exporter[195945]: ERROR   21:46:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:46:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:46:19 compute-1 openstack_network_exporter[195945]: ERROR   21:46:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:46:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:46:19 compute-1 podman[209192]: 2026-01-27 21:46:19.80510102 +0000 UTC m=+0.116227670 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 21:46:19 compute-1 nova_compute[183751]: 2026-01-27 21:46:19.950 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:46:19 compute-1 nova_compute[183751]: 2026-01-27 21:46:19.951 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:46:18 up  1:48,  0 user,  load average: 0.00, 0.01, 0.07\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:46:19 compute-1 nova_compute[183751]: 2026-01-27 21:46:19.972 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:46:20 compute-1 nova_compute[183751]: 2026-01-27 21:46:20.480 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:46:20 compute-1 nova_compute[183751]: 2026-01-27 21:46:20.992 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:46:20 compute-1 nova_compute[183751]: 2026-01-27 21:46:20.993 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.128s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:46:21 compute-1 nova_compute[183751]: 2026-01-27 21:46:21.994 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:21 compute-1 nova_compute[183751]: 2026-01-27 21:46:21.994 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:22 compute-1 podman[209219]: 2026-01-27 21:46:22.748257614 +0000 UTC m=+0.056835229 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 27 21:46:22 compute-1 podman[209218]: 2026-01-27 21:46:22.75292615 +0000 UTC m=+0.060338626 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
name=ubi9-minimal, distribution-scope=public, version=9.6, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64)
Jan 27 21:46:24 compute-1 nova_compute[183751]: 2026-01-27 21:46:24.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:26 compute-1 nova_compute[183751]: 2026-01-27 21:46:26.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:26 compute-1 nova_compute[183751]: 2026-01-27 21:46:26.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 21:46:26 compute-1 nova_compute[183751]: 2026-01-27 21:46:26.657 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 21:46:27 compute-1 nova_compute[183751]: 2026-01-27 21:46:27.658 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:27 compute-1 nova_compute[183751]: 2026-01-27 21:46:27.658 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:46:33 compute-1 podman[209258]: 2026-01-27 21:46:33.785002504 +0000 UTC m=+0.075249045 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:46:35 compute-1 podman[193064]: time="2026-01-27T21:46:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:46:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:46:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:46:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:46:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 21:46:36 compute-1 nova_compute[183751]: 2026-01-27 21:46:36.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:36 compute-1 nova_compute[183751]: 2026-01-27 21:46:36.150 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 21:46:44 compute-1 nova_compute[183751]: 2026-01-27 21:46:44.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:46:49 compute-1 openstack_network_exporter[195945]: ERROR   21:46:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:46:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:46:49 compute-1 openstack_network_exporter[195945]: ERROR   21:46:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:46:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:46:50 compute-1 podman[209282]: 2026-01-27 21:46:50.804130913 +0000 UTC m=+0.113790369 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 21:46:53 compute-1 podman[209308]: 2026-01-27 21:46:53.773125677 +0000 UTC m=+0.076561477 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 27 21:46:53 compute-1 podman[209309]: 2026-01-27 21:46:53.783374421 +0000 UTC m=+0.080926225 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2)
Jan 27 21:47:04 compute-1 podman[209347]: 2026-01-27 21:47:04.777429646 +0000 UTC m=+0.076595418 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:47:05 compute-1 podman[193064]: time="2026-01-27T21:47:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:47:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:47:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:47:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:47:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 21:47:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:47:11.207 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:47:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:47:11.207 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:47:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:47:11.207 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:47:15 compute-1 nova_compute[183751]: 2026-01-27 21:47:15.655 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:47:16 compute-1 nova_compute[183751]: 2026-01-27 21:47:16.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:47:19 compute-1 openstack_network_exporter[195945]: ERROR   21:47:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:47:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:47:19 compute-1 openstack_network_exporter[195945]: ERROR   21:47:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:47:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.667 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.825 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.826 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.842 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.843 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6166MB free_disk=73.18205261230469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.843 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:47:19 compute-1 nova_compute[183751]: 2026-01-27 21:47:19.844 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:47:20 compute-1 nova_compute[183751]: 2026-01-27 21:47:20.989 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:47:20 compute-1 nova_compute[183751]: 2026-01-27 21:47:20.990 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:47:19 up  1:49,  0 user,  load average: 0.00, 0.00, 0.06\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:47:21 compute-1 nova_compute[183751]: 2026-01-27 21:47:21.089 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 21:47:21 compute-1 nova_compute[183751]: 2026-01-27 21:47:21.199 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 21:47:21 compute-1 nova_compute[183751]: 2026-01-27 21:47:21.200 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:47:21 compute-1 nova_compute[183751]: 2026-01-27 21:47:21.213 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 21:47:21 compute-1 nova_compute[183751]: 2026-01-27 21:47:21.232 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 21:47:21 compute-1 nova_compute[183751]: 2026-01-27 21:47:21.260 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:47:21 compute-1 nova_compute[183751]: 2026-01-27 21:47:21.770 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:47:21 compute-1 podman[209373]: 2026-01-27 21:47:21.80882319 +0000 UTC m=+0.123618803 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 27 21:47:22 compute-1 nova_compute[183751]: 2026-01-27 21:47:22.281 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:47:22 compute-1 nova_compute[183751]: 2026-01-27 21:47:22.281 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.437s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:47:24 compute-1 nova_compute[183751]: 2026-01-27 21:47:24.277 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:47:24 compute-1 podman[209400]: 2026-01-27 21:47:24.765255382 +0000 UTC m=+0.063552685 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 27 21:47:24 compute-1 podman[209399]: 2026-01-27 21:47:24.766733088 +0000 UTC m=+0.070430285 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7)
Jan 27 21:47:24 compute-1 nova_compute[183751]: 2026-01-27 21:47:24.787 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:47:24 compute-1 nova_compute[183751]: 2026-01-27 21:47:24.788 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:47:26 compute-1 nova_compute[183751]: 2026-01-27 21:47:26.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:47:28 compute-1 nova_compute[183751]: 2026-01-27 21:47:28.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:47:28 compute-1 nova_compute[183751]: 2026-01-27 21:47:28.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:47:35 compute-1 podman[193064]: time="2026-01-27T21:47:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:47:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:47:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:47:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:47:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 27 21:47:35 compute-1 podman[209438]: 2026-01-27 21:47:35.790982941 +0000 UTC m=+0.070949139 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:47:40 compute-1 sshd-session[209462]: Invalid user jupyter from 80.94.92.186 port 33112
Jan 27 21:47:40 compute-1 sshd-session[209462]: Connection closed by invalid user jupyter 80.94.92.186 port 33112 [preauth]
Jan 27 21:47:49 compute-1 openstack_network_exporter[195945]: ERROR   21:47:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:47:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:47:49 compute-1 openstack_network_exporter[195945]: ERROR   21:47:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:47:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:47:52 compute-1 podman[209464]: 2026-01-27 21:47:52.802223266 +0000 UTC m=+0.112069128 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260126, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 21:47:55 compute-1 podman[209491]: 2026-01-27 21:47:55.762923285 +0000 UTC m=+0.062110489 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 27 21:47:55 compute-1 podman[209490]: 2026-01-27 21:47:55.763073289 +0000 UTC m=+0.068527149 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Jan 27 21:48:05 compute-1 podman[193064]: time="2026-01-27T21:48:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:48:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:48:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:48:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:48:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 21:48:06 compute-1 podman[209527]: 2026-01-27 21:48:06.768168654 +0000 UTC m=+0.068527428 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:48:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:48:11.209 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:48:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:48:11.209 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:48:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:48:11.209 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:48:16 compute-1 nova_compute[183751]: 2026-01-27 21:48:16.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:48:17 compute-1 nova_compute[183751]: 2026-01-27 21:48:17.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:48:19 compute-1 openstack_network_exporter[195945]: ERROR   21:48:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:48:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:48:19 compute-1 openstack_network_exporter[195945]: ERROR   21:48:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:48:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:48:20 compute-1 nova_compute[183751]: 2026-01-27 21:48:20.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.669 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.866 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.867 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.901 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.901 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6176MB free_disk=73.18205261230469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.902 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:48:21 compute-1 nova_compute[183751]: 2026-01-27 21:48:21.902 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:48:22 compute-1 nova_compute[183751]: 2026-01-27 21:48:22.956 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:48:22 compute-1 nova_compute[183751]: 2026-01-27 21:48:22.957 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:48:21 up  1:50,  0 user,  load average: 0.00, 0.00, 0.05\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:48:22 compute-1 nova_compute[183751]: 2026-01-27 21:48:22.978 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:48:23 compute-1 nova_compute[183751]: 2026-01-27 21:48:23.485 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:48:23 compute-1 podman[209553]: 2026-01-27 21:48:23.81742562 +0000 UTC m=+0.118820834 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 21:48:23 compute-1 nova_compute[183751]: 2026-01-27 21:48:23.993 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:48:23 compute-1 nova_compute[183751]: 2026-01-27 21:48:23.993 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:48:24 compute-1 nova_compute[183751]: 2026-01-27 21:48:24.993 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:48:24 compute-1 nova_compute[183751]: 2026-01-27 21:48:24.993 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:48:26 compute-1 nova_compute[183751]: 2026-01-27 21:48:26.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:48:26 compute-1 podman[209580]: 2026-01-27 21:48:26.760615215 +0000 UTC m=+0.061270369 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Jan 27 21:48:26 compute-1 podman[209579]: 2026-01-27 21:48:26.765158417 +0000 UTC m=+0.068998479 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 27 21:48:28 compute-1 nova_compute[183751]: 2026-01-27 21:48:28.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:48:28 compute-1 nova_compute[183751]: 2026-01-27 21:48:28.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:48:35 compute-1 podman[193064]: time="2026-01-27T21:48:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:48:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:48:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:48:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:48:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 21:48:37 compute-1 podman[209620]: 2026-01-27 21:48:37.776264065 +0000 UTC m=+0.083317295 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:48:49 compute-1 openstack_network_exporter[195945]: ERROR   21:48:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:48:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:48:49 compute-1 openstack_network_exporter[195945]: ERROR   21:48:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:48:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:48:54 compute-1 podman[209645]: 2026-01-27 21:48:54.846046488 +0000 UTC m=+0.152731194 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20260126, tcib_managed=true)
Jan 27 21:48:57 compute-1 podman[209672]: 2026-01-27 21:48:57.766174692 +0000 UTC m=+0.067010071 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 27 21:48:57 compute-1 podman[209671]: 2026-01-27 21:48:57.770792946 +0000 UTC m=+0.078438764 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, 
build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, release=1755695350)
Jan 27 21:49:05 compute-1 podman[193064]: time="2026-01-27T21:49:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:49:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:49:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:49:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:49:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2163 "" "Go-http-client/1.1"
Jan 27 21:49:08 compute-1 podman[209711]: 2026-01-27 21:49:08.776764016 +0000 UTC m=+0.083447158 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:49:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:49:11.210 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:49:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:49:11.211 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:49:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:49:11.211 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:49:16 compute-1 nova_compute[183751]: 2026-01-27 21:49:16.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:49:19 compute-1 nova_compute[183751]: 2026-01-27 21:49:19.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:49:19 compute-1 openstack_network_exporter[195945]: ERROR   21:49:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:49:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:49:19 compute-1 openstack_network_exporter[195945]: ERROR   21:49:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:49:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.852 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.853 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.890 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.891 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6170MB free_disk=73.18205261230469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.891 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:49:22 compute-1 nova_compute[183751]: 2026-01-27 21:49:22.892 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:49:23 compute-1 nova_compute[183751]: 2026-01-27 21:49:23.942 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:49:23 compute-1 nova_compute[183751]: 2026-01-27 21:49:23.943 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:49:22 up  1:51,  0 user,  load average: 0.00, 0.00, 0.05\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:49:23 compute-1 nova_compute[183751]: 2026-01-27 21:49:23.968 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:49:24 compute-1 nova_compute[183751]: 2026-01-27 21:49:24.476 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:49:24 compute-1 nova_compute[183751]: 2026-01-27 21:49:24.984 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:49:24 compute-1 nova_compute[183751]: 2026-01-27 21:49:24.985 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:49:25 compute-1 podman[209737]: 2026-01-27 21:49:25.812277781 +0000 UTC m=+0.126339120 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_managed=true, container_name=ovn_controller)
Jan 27 21:49:25 compute-1 nova_compute[183751]: 2026-01-27 21:49:25.986 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:49:27 compute-1 nova_compute[183751]: 2026-01-27 21:49:27.010 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:49:27 compute-1 nova_compute[183751]: 2026-01-27 21:49:27.011 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:49:27 compute-1 nova_compute[183751]: 2026-01-27 21:49:27.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:49:28 compute-1 nova_compute[183751]: 2026-01-27 21:49:28.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:49:28 compute-1 nova_compute[183751]: 2026-01-27 21:49:28.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:49:28 compute-1 podman[209763]: 2026-01-27 21:49:28.772129108 +0000 UTC m=+0.080402672 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-type=git, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public)
Jan 27 21:49:28 compute-1 podman[209764]: 2026-01-27 21:49:28.789230992 +0000 UTC m=+0.093231411 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS)
Jan 27 21:49:35 compute-1 podman[193064]: time="2026-01-27T21:49:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:49:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:49:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:49:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:49:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 21:49:39 compute-1 podman[209802]: 2026-01-27 21:49:39.766974241 +0000 UTC m=+0.068692803 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:49:49 compute-1 openstack_network_exporter[195945]: ERROR   21:49:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:49:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:49:49 compute-1 openstack_network_exporter[195945]: ERROR   21:49:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:49:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:49:56 compute-1 podman[209827]: 2026-01-27 21:49:56.825415485 +0000 UTC m=+0.129017477 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:49:59 compute-1 podman[209853]: 2026-01-27 21:49:59.767646426 +0000 UTC m=+0.078347631 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, 
build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9-minimal, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git)
Jan 27 21:49:59 compute-1 podman[209854]: 2026-01-27 21:49:59.788060672 +0000 UTC m=+0.085060858 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 27 21:50:05 compute-1 podman[193064]: time="2026-01-27T21:50:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:50:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:50:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:50:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:50:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 21:50:10 compute-1 podman[209893]: 2026-01-27 21:50:10.742924665 +0000 UTC m=+0.056154742 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:50:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:50:11.211 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:50:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:50:11.212 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:50:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:50:11.212 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:50:16 compute-1 nova_compute[183751]: 2026-01-27 21:50:16.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:50:19 compute-1 nova_compute[183751]: 2026-01-27 21:50:19.151 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:50:19 compute-1 openstack_network_exporter[195945]: ERROR   21:50:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:50:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:50:19 compute-1 openstack_network_exporter[195945]: ERROR   21:50:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:50:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:50:22 compute-1 nova_compute[183751]: 2026-01-27 21:50:22.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:50:23 compute-1 nova_compute[183751]: 2026-01-27 21:50:23.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.659 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.659 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.660 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.660 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.825 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.826 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.856 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.857 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6172MB free_disk=73.18205261230469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.857 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:50:24 compute-1 nova_compute[183751]: 2026-01-27 21:50:24.857 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:50:25 compute-1 nova_compute[183751]: 2026-01-27 21:50:25.904 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:50:25 compute-1 nova_compute[183751]: 2026-01-27 21:50:25.905 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:50:24 up  1:52,  0 user,  load average: 0.04, 0.01, 0.05\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:50:25 compute-1 nova_compute[183751]: 2026-01-27 21:50:25.939 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:50:26 compute-1 nova_compute[183751]: 2026-01-27 21:50:26.447 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:50:26 compute-1 nova_compute[183751]: 2026-01-27 21:50:26.958 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:50:26 compute-1 nova_compute[183751]: 2026-01-27 21:50:26.958 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:50:27 compute-1 podman[209920]: 2026-01-27 21:50:27.800486686 +0000 UTC m=+0.098239195 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 27 21:50:27 compute-1 nova_compute[183751]: 2026-01-27 21:50:27.958 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:50:27 compute-1 nova_compute[183751]: 2026-01-27 21:50:27.959 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:50:30 compute-1 nova_compute[183751]: 2026-01-27 21:50:30.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:50:30 compute-1 nova_compute[183751]: 2026-01-27 21:50:30.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:50:30 compute-1 podman[209947]: 2026-01-27 21:50:30.770578237 +0000 UTC m=+0.072477036 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, 
container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6)
Jan 27 21:50:30 compute-1 podman[209948]: 2026-01-27 21:50:30.78804719 +0000 UTC m=+0.085813247 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 27 21:50:35 compute-1 podman[193064]: time="2026-01-27T21:50:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:50:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:50:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:50:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:50:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 27 21:50:41 compute-1 podman[209986]: 2026-01-27 21:50:41.745589489 +0000 UTC m=+0.056051189 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:50:46 compute-1 nova_compute[183751]: 2026-01-27 21:50:46.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:50:46 compute-1 nova_compute[183751]: 2026-01-27 21:50:46.150 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:50:46 compute-1 nova_compute[183751]: 2026-01-27 21:50:46.151 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:50:46 compute-1 nova_compute[183751]: 2026-01-27 21:50:46.152 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:50:46 compute-1 nova_compute[183751]: 2026-01-27 21:50:46.152 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:50:46 compute-1 nova_compute[183751]: 2026-01-27 21:50:46.153 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:50:46 compute-1 nova_compute[183751]: 2026-01-27 21:50:46.153 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:50:46 compute-1 nova_compute[183751]: 2026-01-27 21:50:46.661 183755 DEBUG nova.virt.libvirt.imagecache [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Skipping verification, no base directory at /var/lib/nova/instances/_base _get_base /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:367
Jan 27 21:50:49 compute-1 openstack_network_exporter[195945]: ERROR   21:50:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:50:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:50:49 compute-1 openstack_network_exporter[195945]: ERROR   21:50:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:50:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:50:58 compute-1 podman[210010]: 2026-01-27 21:50:58.788198339 +0000 UTC m=+0.094585764 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 21:51:01 compute-1 podman[210038]: 2026-01-27 21:51:01.771729393 +0000 UTC m=+0.067286307 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 27 21:51:01 compute-1 podman[210037]: 2026-01-27 21:51:01.778730666 +0000 UTC m=+0.077663534 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git)
Jan 27 21:51:05 compute-1 podman[193064]: time="2026-01-27T21:51:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:51:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:51:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:51:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:51:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 21:51:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:51:11.213 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:51:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:51:11.213 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:51:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:51:11.213 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:51:12 compute-1 podman[210075]: 2026-01-27 21:51:12.778516792 +0000 UTC m=+0.084701619 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:51:17 compute-1 nova_compute[183751]: 2026-01-27 21:51:17.661 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:19 compute-1 openstack_network_exporter[195945]: ERROR   21:51:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:51:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:51:19 compute-1 openstack_network_exporter[195945]: ERROR   21:51:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:51:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:51:21 compute-1 nova_compute[183751]: 2026-01-27 21:51:21.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:22 compute-1 nova_compute[183751]: 2026-01-27 21:51:22.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:23 compute-1 nova_compute[183751]: 2026-01-27 21:51:23.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.838 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.839 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.865 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.866 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6176MB free_disk=73.18204879760742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.866 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:51:26 compute-1 nova_compute[183751]: 2026-01-27 21:51:26.866 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:51:28 compute-1 nova_compute[183751]: 2026-01-27 21:51:28.098 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:51:28 compute-1 nova_compute[183751]: 2026-01-27 21:51:28.098 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:51:26 up  1:53,  0 user,  load average: 0.01, 0.01, 0.04\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:51:28 compute-1 nova_compute[183751]: 2026-01-27 21:51:28.129 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:51:28 compute-1 nova_compute[183751]: 2026-01-27 21:51:28.636 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:51:29 compute-1 nova_compute[183751]: 2026-01-27 21:51:29.147 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:51:29 compute-1 nova_compute[183751]: 2026-01-27 21:51:29.148 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.281s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:51:29 compute-1 nova_compute[183751]: 2026-01-27 21:51:29.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:29 compute-1 nova_compute[183751]: 2026-01-27 21:51:29.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 21:51:29 compute-1 nova_compute[183751]: 2026-01-27 21:51:29.655 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 21:51:29 compute-1 podman[210103]: 2026-01-27 21:51:29.809079784 +0000 UTC m=+0.113610765 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 27 21:51:32 compute-1 nova_compute[183751]: 2026-01-27 21:51:32.655 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:32 compute-1 podman[210131]: 2026-01-27 21:51:32.776247943 +0000 UTC m=+0.079660444 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:51:32 compute-1 podman[210130]: 2026-01-27 21:51:32.789257505 +0000 UTC m=+0.093292281 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.buildah.version=1.33.7)
Jan 27 21:51:33 compute-1 nova_compute[183751]: 2026-01-27 21:51:33.171 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:33 compute-1 nova_compute[183751]: 2026-01-27 21:51:33.172 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:33 compute-1 nova_compute[183751]: 2026-01-27 21:51:33.172 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:51:35 compute-1 podman[193064]: time="2026-01-27T21:51:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:51:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:51:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:51:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:51:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2163 "" "Go-http-client/1.1"
Jan 27 21:51:37 compute-1 nova_compute[183751]: 2026-01-27 21:51:37.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:37 compute-1 nova_compute[183751]: 2026-01-27 21:51:37.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 21:51:43 compute-1 podman[210170]: 2026-01-27 21:51:43.735767252 +0000 UTC m=+0.052224224 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:51:48 compute-1 nova_compute[183751]: 2026-01-27 21:51:48.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:51:49 compute-1 openstack_network_exporter[195945]: ERROR   21:51:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:51:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:51:49 compute-1 openstack_network_exporter[195945]: ERROR   21:51:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:51:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:52:00 compute-1 podman[210194]: 2026-01-27 21:52:00.801712331 +0000 UTC m=+0.112908878 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260126, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:52:03 compute-1 podman[210221]: 2026-01-27 21:52:03.752161885 +0000 UTC m=+0.052641985 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 27 21:52:03 compute-1 podman[210220]: 2026-01-27 21:52:03.773243827 +0000 UTC m=+0.082083864 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Jan 27 21:52:05 compute-1 podman[193064]: time="2026-01-27T21:52:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:52:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:52:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:52:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:52:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 27 21:52:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:52:11.214 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:52:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:52:11.214 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:52:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:52:11.215 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:52:13 compute-1 sshd-session[210262]: Invalid user grafana from 80.94.92.186 port 36120
Jan 27 21:52:13 compute-1 podman[210264]: 2026-01-27 21:52:13.974187529 +0000 UTC m=+0.063665168 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:52:14 compute-1 sshd-session[210262]: Connection closed by invalid user grafana 80.94.92.186 port 36120 [preauth]
Jan 27 21:52:17 compute-1 nova_compute[183751]: 2026-01-27 21:52:17.654 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:52:19 compute-1 openstack_network_exporter[195945]: ERROR   21:52:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:52:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:52:19 compute-1 openstack_network_exporter[195945]: ERROR   21:52:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:52:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:52:21 compute-1 nova_compute[183751]: 2026-01-27 21:52:21.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:52:22 compute-1 nova_compute[183751]: 2026-01-27 21:52:22.786 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:52:23 compute-1 nova_compute[183751]: 2026-01-27 21:52:23.656 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:52:23 compute-1 nova_compute[183751]: 2026-01-27 21:52:23.657 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.847 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.849 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.870 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.871 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6175MB free_disk=73.17814254760742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.871 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:52:28 compute-1 nova_compute[183751]: 2026-01-27 21:52:28.871 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:52:29 compute-1 nova_compute[183751]: 2026-01-27 21:52:29.961 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:52:29 compute-1 nova_compute[183751]: 2026-01-27 21:52:29.962 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:52:28 up  1:54,  0 user,  load average: 0.00, 0.00, 0.04\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:52:30 compute-1 nova_compute[183751]: 2026-01-27 21:52:30.011 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 21:52:30 compute-1 nova_compute[183751]: 2026-01-27 21:52:30.067 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 21:52:30 compute-1 nova_compute[183751]: 2026-01-27 21:52:30.068 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:52:30 compute-1 nova_compute[183751]: 2026-01-27 21:52:30.094 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 21:52:30 compute-1 nova_compute[183751]: 2026-01-27 21:52:30.118 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 21:52:30 compute-1 nova_compute[183751]: 2026-01-27 21:52:30.137 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:52:30 compute-1 nova_compute[183751]: 2026-01-27 21:52:30.644 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:52:31 compute-1 nova_compute[183751]: 2026-01-27 21:52:31.157 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:52:31 compute-1 nova_compute[183751]: 2026-01-27 21:52:31.158 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.286s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:52:31 compute-1 podman[210289]: 2026-01-27 21:52:31.857711167 +0000 UTC m=+0.172375810 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 27 21:52:33 compute-1 nova_compute[183751]: 2026-01-27 21:52:33.159 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:52:33 compute-1 nova_compute[183751]: 2026-01-27 21:52:33.160 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:52:33 compute-1 nova_compute[183751]: 2026-01-27 21:52:33.160 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:52:34 compute-1 podman[210318]: 2026-01-27 21:52:34.765331601 +0000 UTC m=+0.065677687 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:52:34 compute-1 podman[210317]: 2026-01-27 21:52:34.792459243 +0000 UTC m=+0.095612549 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350)
Jan 27 21:52:35 compute-1 podman[193064]: time="2026-01-27T21:52:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:52:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:52:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:52:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:52:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 27 21:52:44 compute-1 podman[210360]: 2026-01-27 21:52:44.765461139 +0000 UTC m=+0.071167944 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:52:49 compute-1 openstack_network_exporter[195945]: ERROR   21:52:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:52:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:52:49 compute-1 openstack_network_exporter[195945]: ERROR   21:52:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:52:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:53:02 compute-1 podman[210384]: 2026-01-27 21:53:02.802770634 +0000 UTC m=+0.109711798 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 27 21:53:05 compute-1 podman[193064]: time="2026-01-27T21:53:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:53:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:53:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:53:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:53:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 21:53:05 compute-1 podman[210413]: 2026-01-27 21:53:05.732864834 +0000 UTC m=+0.047871846 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Jan 27 21:53:05 compute-1 podman[210412]: 2026-01-27 21:53:05.763593025 +0000 UTC m=+0.073259035 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 27 21:53:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:53:11.216 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:53:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:53:11.216 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:53:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:53:11.216 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:53:15 compute-1 podman[210454]: 2026-01-27 21:53:15.766362938 +0000 UTC m=+0.073097191 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:53:17 compute-1 nova_compute[183751]: 2026-01-27 21:53:17.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:53:19 compute-1 openstack_network_exporter[195945]: ERROR   21:53:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:53:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:53:19 compute-1 openstack_network_exporter[195945]: ERROR   21:53:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:53:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:53:22 compute-1 nova_compute[183751]: 2026-01-27 21:53:22.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:53:23 compute-1 nova_compute[183751]: 2026-01-27 21:53:23.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:53:24 compute-1 nova_compute[183751]: 2026-01-27 21:53:24.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.675 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.675 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.675 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.676 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.807 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.808 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.834 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.835 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6184MB free_disk=73.17814254760742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.835 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:53:30 compute-1 nova_compute[183751]: 2026-01-27 21:53:30.836 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:53:32 compute-1 nova_compute[183751]: 2026-01-27 21:53:32.106 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:53:32 compute-1 nova_compute[183751]: 2026-01-27 21:53:32.106 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:53:30 up  1:55,  0 user,  load average: 0.00, 0.00, 0.03\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:53:32 compute-1 nova_compute[183751]: 2026-01-27 21:53:32.129 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:53:32 compute-1 nova_compute[183751]: 2026-01-27 21:53:32.639 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:53:33 compute-1 nova_compute[183751]: 2026-01-27 21:53:33.152 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:53:33 compute-1 nova_compute[183751]: 2026-01-27 21:53:33.153 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.317s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:53:33 compute-1 podman[210479]: 2026-01-27 21:53:33.816845872 +0000 UTC m=+0.119645545 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest)
Jan 27 21:53:35 compute-1 nova_compute[183751]: 2026-01-27 21:53:35.153 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:53:35 compute-1 podman[193064]: time="2026-01-27T21:53:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:53:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:53:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:53:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:53:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 21:53:35 compute-1 nova_compute[183751]: 2026-01-27 21:53:35.663 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:53:35 compute-1 nova_compute[183751]: 2026-01-27 21:53:35.663 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:53:36 compute-1 podman[210507]: 2026-01-27 21:53:36.783701703 +0000 UTC m=+0.084989996 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Jan 27 21:53:36 compute-1 podman[210508]: 2026-01-27 21:53:36.815588953 +0000 UTC m=+0.112501938 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Jan 27 21:53:46 compute-1 podman[210548]: 2026-01-27 21:53:46.750031793 +0000 UTC m=+0.058446738 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:53:49 compute-1 openstack_network_exporter[195945]: ERROR   21:53:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:53:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:53:49 compute-1 openstack_network_exporter[195945]: ERROR   21:53:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:53:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:54:04 compute-1 podman[210572]: 2026-01-27 21:54:04.795560854 +0000 UTC m=+0.106254982 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Jan 27 21:54:05 compute-1 podman[193064]: time="2026-01-27T21:54:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:54:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:54:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:54:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:54:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Jan 27 21:54:07 compute-1 podman[210600]: 2026-01-27 21:54:07.758262002 +0000 UTC m=+0.061646398 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 27 21:54:07 compute-1 podman[210599]: 2026-01-27 21:54:07.792429458 +0000 UTC m=+0.096963722 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Jan 27 21:54:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:54:11.218 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:54:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:54:11.218 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:54:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:54:11.218 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:54:17 compute-1 podman[210640]: 2026-01-27 21:54:17.769179305 +0000 UTC m=+0.075638595 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 21:54:19 compute-1 nova_compute[183751]: 2026-01-27 21:54:19.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:54:19 compute-1 openstack_network_exporter[195945]: ERROR   21:54:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:54:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:54:19 compute-1 openstack_network_exporter[195945]: ERROR   21:54:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:54:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:54:24 compute-1 nova_compute[183751]: 2026-01-27 21:54:24.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:54:25 compute-1 nova_compute[183751]: 2026-01-27 21:54:25.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:54:25 compute-1 nova_compute[183751]: 2026-01-27 21:54:25.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:54:30 compute-1 nova_compute[183751]: 2026-01-27 21:54:30.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.676 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.677 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.677 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.677 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.860 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.861 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.878 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.879 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6172MB free_disk=73.17814254760742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.879 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:54:31 compute-1 nova_compute[183751]: 2026-01-27 21:54:31.879 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:54:32 compute-1 nova_compute[183751]: 2026-01-27 21:54:32.929 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:54:32 compute-1 nova_compute[183751]: 2026-01-27 21:54:32.930 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:54:31 up  1:56,  0 user,  load average: 0.00, 0.00, 0.02\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:54:32 compute-1 nova_compute[183751]: 2026-01-27 21:54:32.951 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:54:33 compute-1 nova_compute[183751]: 2026-01-27 21:54:33.458 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:54:33 compute-1 nova_compute[183751]: 2026-01-27 21:54:33.972 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:54:33 compute-1 nova_compute[183751]: 2026-01-27 21:54:33.973 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:54:34 compute-1 nova_compute[183751]: 2026-01-27 21:54:34.972 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:54:34 compute-1 nova_compute[183751]: 2026-01-27 21:54:34.973 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:54:35 compute-1 podman[193064]: time="2026-01-27T21:54:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:54:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:54:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:54:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:54:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2163 "" "Go-http-client/1.1"
Jan 27 21:54:35 compute-1 podman[210667]: 2026-01-27 21:54:35.800662196 +0000 UTC m=+0.114391984 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 27 21:54:38 compute-1 podman[210694]: 2026-01-27 21:54:38.788859953 +0000 UTC m=+0.093984177 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:54:38 compute-1 podman[210695]: 2026-01-27 21:54:38.791255042 +0000 UTC m=+0.086087340 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Jan 27 21:54:48 compute-1 podman[210729]: 2026-01-27 21:54:48.736862045 +0000 UTC m=+0.055656894 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:54:49 compute-1 openstack_network_exporter[195945]: ERROR   21:54:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:54:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:54:49 compute-1 openstack_network_exporter[195945]: ERROR   21:54:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:54:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:55:05 compute-1 podman[193064]: time="2026-01-27T21:55:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:55:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:55:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:55:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:55:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 21:55:06 compute-1 podman[210755]: 2026-01-27 21:55:06.78294508 +0000 UTC m=+0.094895109 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 21:55:09 compute-1 podman[210782]: 2026-01-27 21:55:09.743715146 +0000 UTC m=+0.060080874 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 27 21:55:09 compute-1 podman[210783]: 2026-01-27 21:55:09.784738596 +0000 UTC m=+0.090298155 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Jan 27 21:55:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:55:11.219 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:55:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:55:11.219 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:55:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:55:11.219 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:55:19 compute-1 nova_compute[183751]: 2026-01-27 21:55:19.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:55:19 compute-1 openstack_network_exporter[195945]: ERROR   21:55:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:55:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:55:19 compute-1 openstack_network_exporter[195945]: ERROR   21:55:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:55:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:55:19 compute-1 podman[210822]: 2026-01-27 21:55:19.735196161 +0000 UTC m=+0.051926291 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 21:55:24 compute-1 nova_compute[183751]: 2026-01-27 21:55:24.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:55:27 compute-1 nova_compute[183751]: 2026-01-27 21:55:27.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:55:27 compute-1 nova_compute[183751]: 2026-01-27 21:55:27.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:55:31 compute-1 nova_compute[183751]: 2026-01-27 21:55:31.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.668 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.828 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.829 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.863 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.864 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6178MB free_disk=73.17814254760742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.865 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:55:33 compute-1 nova_compute[183751]: 2026-01-27 21:55:33.865 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:55:34 compute-1 nova_compute[183751]: 2026-01-27 21:55:34.916 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:55:34 compute-1 nova_compute[183751]: 2026-01-27 21:55:34.917 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:55:33 up  1:57,  0 user,  load average: 0.00, 0.00, 0.02\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:55:34 compute-1 nova_compute[183751]: 2026-01-27 21:55:34.949 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:55:35 compute-1 nova_compute[183751]: 2026-01-27 21:55:35.457 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:55:35 compute-1 podman[193064]: time="2026-01-27T21:55:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:55:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:55:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:55:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:55:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 21:55:35 compute-1 nova_compute[183751]: 2026-01-27 21:55:35.970 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:55:35 compute-1 nova_compute[183751]: 2026-01-27 21:55:35.970 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:55:36 compute-1 nova_compute[183751]: 2026-01-27 21:55:36.968 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:55:37 compute-1 nova_compute[183751]: 2026-01-27 21:55:37.480 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:55:37 compute-1 nova_compute[183751]: 2026-01-27 21:55:37.481 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:55:37 compute-1 podman[210847]: 2026-01-27 21:55:37.790951934 +0000 UTC m=+0.094545541 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 27 21:55:40 compute-1 podman[210873]: 2026-01-27 21:55:40.74044195 +0000 UTC m=+0.058000343 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Jan 27 21:55:40 compute-1 podman[210874]: 2026-01-27 21:55:40.770712072 +0000 UTC m=+0.081061265 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 27 21:55:49 compute-1 openstack_network_exporter[195945]: ERROR   21:55:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:55:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:55:49 compute-1 openstack_network_exporter[195945]: ERROR   21:55:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:55:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:55:50 compute-1 podman[210913]: 2026-01-27 21:55:50.734126015 +0000 UTC m=+0.051509541 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:56:05 compute-1 podman[193064]: time="2026-01-27T21:56:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:56:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:56:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:56:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:56:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 21:56:08 compute-1 podman[210939]: 2026-01-27 21:56:08.782155026 +0000 UTC m=+0.096624712 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 27 21:56:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:56:11.220 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:56:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:56:11.221 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:56:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:56:11.221 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:56:11 compute-1 podman[210966]: 2026-01-27 21:56:11.738140543 +0000 UTC m=+0.052979417 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 27 21:56:11 compute-1 podman[210967]: 2026-01-27 21:56:11.775810309 +0000 UTC m=+0.083742152 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 21:56:19 compute-1 openstack_network_exporter[195945]: ERROR   21:56:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:56:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:56:19 compute-1 openstack_network_exporter[195945]: ERROR   21:56:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:56:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:56:20 compute-1 nova_compute[183751]: 2026-01-27 21:56:20.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:56:21 compute-1 podman[211006]: 2026-01-27 21:56:21.772271727 +0000 UTC m=+0.079549828 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:56:26 compute-1 nova_compute[183751]: 2026-01-27 21:56:26.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:56:28 compute-1 nova_compute[183751]: 2026-01-27 21:56:28.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:56:29 compute-1 nova_compute[183751]: 2026-01-27 21:56:29.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:56:32 compute-1 nova_compute[183751]: 2026-01-27 21:56:32.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:56:32 compute-1 nova_compute[183751]: 2026-01-27 21:56:32.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:56:32 compute-1 nova_compute[183751]: 2026-01-27 21:56:32.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 21:56:32 compute-1 nova_compute[183751]: 2026-01-27 21:56:32.667 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 21:56:33 compute-1 nova_compute[183751]: 2026-01-27 21:56:33.668 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.869 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.870 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.895 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.896 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6174MB free_disk=73.17814254760742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.896 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:56:34 compute-1 nova_compute[183751]: 2026-01-27 21:56:34.897 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:56:35 compute-1 podman[193064]: time="2026-01-27T21:56:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:56:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:56:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:56:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:56:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 21:56:35 compute-1 nova_compute[183751]: 2026-01-27 21:56:35.953 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:56:35 compute-1 nova_compute[183751]: 2026-01-27 21:56:35.954 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:56:34 up  1:58,  0 user,  load average: 0.00, 0.00, 0.01\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:56:35 compute-1 nova_compute[183751]: 2026-01-27 21:56:35.989 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:56:36 compute-1 nova_compute[183751]: 2026-01-27 21:56:36.497 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:56:37 compute-1 nova_compute[183751]: 2026-01-27 21:56:37.010 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:56:37 compute-1 nova_compute[183751]: 2026-01-27 21:56:37.010 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:56:38 compute-1 nova_compute[183751]: 2026-01-27 21:56:38.010 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:56:38 compute-1 nova_compute[183751]: 2026-01-27 21:56:38.011 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:56:39 compute-1 podman[211031]: 2026-01-27 21:56:39.824308939 +0000 UTC m=+0.122807813 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0)
Jan 27 21:56:40 compute-1 nova_compute[183751]: 2026-01-27 21:56:40.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:56:40 compute-1 nova_compute[183751]: 2026-01-27 21:56:40.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 21:56:42 compute-1 podman[211061]: 2026-01-27 21:56:42.750833934 +0000 UTC m=+0.059383157 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 21:56:42 compute-1 podman[211060]: 2026-01-27 21:56:42.762965475 +0000 UTC m=+0.072617355 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Jan 27 21:56:43 compute-1 sshd-session[211059]: Invalid user mapr from 80.94.92.186 port 39124
Jan 27 21:56:43 compute-1 sshd-session[211059]: Connection closed by invalid user mapr 80.94.92.186 port 39124 [preauth]
Jan 27 21:56:49 compute-1 openstack_network_exporter[195945]: ERROR   21:56:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:56:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:56:49 compute-1 openstack_network_exporter[195945]: ERROR   21:56:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:56:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:56:52 compute-1 podman[211099]: 2026-01-27 21:56:52.773218747 +0000 UTC m=+0.088950501 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:56:55 compute-1 nova_compute[183751]: 2026-01-27 21:56:55.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:57:05 compute-1 podman[193064]: time="2026-01-27T21:57:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:57:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:57:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:57:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:57:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 21:57:10 compute-1 podman[211124]: 2026-01-27 21:57:10.80235582 +0000 UTC m=+0.105812851 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 27 21:57:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:57:11.222 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:57:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:57:11.223 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:57:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:57:11.223 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:57:13 compute-1 podman[211152]: 2026-01-27 21:57:13.758926702 +0000 UTC m=+0.069623222 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest)
Jan 27 21:57:13 compute-1 podman[211151]: 2026-01-27 21:57:13.766718825 +0000 UTC m=+0.079522707 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, vcs-type=git, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 21:57:19 compute-1 openstack_network_exporter[195945]: ERROR   21:57:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:57:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:57:19 compute-1 openstack_network_exporter[195945]: ERROR   21:57:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:57:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:57:20 compute-1 nova_compute[183751]: 2026-01-27 21:57:20.655 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:57:23 compute-1 podman[211190]: 2026-01-27 21:57:23.776929983 +0000 UTC m=+0.079865336 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:57:26 compute-1 nova_compute[183751]: 2026-01-27 21:57:26.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:57:30 compute-1 nova_compute[183751]: 2026-01-27 21:57:30.175 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:57:31 compute-1 nova_compute[183751]: 2026-01-27 21:57:31.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:57:32 compute-1 nova_compute[183751]: 2026-01-27 21:57:32.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.871 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.873 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.909 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.910 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6163MB free_disk=73.17802047729492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.910 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:57:34 compute-1 nova_compute[183751]: 2026-01-27 21:57:34.911 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:57:35 compute-1 podman[193064]: time="2026-01-27T21:57:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:57:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:57:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:57:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:57:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 21:57:36 compute-1 nova_compute[183751]: 2026-01-27 21:57:36.025 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:57:36 compute-1 nova_compute[183751]: 2026-01-27 21:57:36.026 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:57:34 up  1:59,  0 user,  load average: 0.00, 0.00, 0.01\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:57:36 compute-1 nova_compute[183751]: 2026-01-27 21:57:36.281 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 21:57:36 compute-1 nova_compute[183751]: 2026-01-27 21:57:36.315 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 21:57:36 compute-1 nova_compute[183751]: 2026-01-27 21:57:36.316 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 21:57:36 compute-1 nova_compute[183751]: 2026-01-27 21:57:36.328 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 21:57:36 compute-1 nova_compute[183751]: 2026-01-27 21:57:36.346 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 21:57:36 compute-1 nova_compute[183751]: 2026-01-27 21:57:36.369 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:57:36 compute-1 nova_compute[183751]: 2026-01-27 21:57:36.877 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:57:37 compute-1 nova_compute[183751]: 2026-01-27 21:57:37.387 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:57:37 compute-1 nova_compute[183751]: 2026-01-27 21:57:37.388 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.477s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:57:40 compute-1 nova_compute[183751]: 2026-01-27 21:57:40.388 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:57:40 compute-1 nova_compute[183751]: 2026-01-27 21:57:40.899 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:57:40 compute-1 nova_compute[183751]: 2026-01-27 21:57:40.900 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:57:41 compute-1 podman[211218]: 2026-01-27 21:57:41.808611819 +0000 UTC m=+0.120832084 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:57:44 compute-1 podman[211245]: 2026-01-27 21:57:44.742406215 +0000 UTC m=+0.059640703 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 27 21:57:44 compute-1 podman[211246]: 2026-01-27 21:57:44.755802768 +0000 UTC m=+0.067423966 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Jan 27 21:57:49 compute-1 openstack_network_exporter[195945]: ERROR   21:57:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:57:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:57:49 compute-1 openstack_network_exporter[195945]: ERROR   21:57:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:57:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:57:54 compute-1 podman[211286]: 2026-01-27 21:57:54.780031475 +0000 UTC m=+0.079689762 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:58:05 compute-1 podman[193064]: time="2026-01-27T21:58:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:58:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:58:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:58:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:58:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 27 21:58:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:58:11.223 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:58:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:58:11.224 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:58:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:58:11.224 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:58:12 compute-1 podman[211311]: 2026-01-27 21:58:12.850093515 +0000 UTC m=+0.164431227 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest)
Jan 27 21:58:15 compute-1 podman[211338]: 2026-01-27 21:58:15.756633943 +0000 UTC m=+0.059773445 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 27 21:58:15 compute-1 podman[211337]: 2026-01-27 21:58:15.767596686 +0000 UTC m=+0.073036685 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git)
Jan 27 21:58:19 compute-1 openstack_network_exporter[195945]: ERROR   21:58:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:58:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:58:19 compute-1 openstack_network_exporter[195945]: ERROR   21:58:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:58:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:58:22 compute-1 nova_compute[183751]: 2026-01-27 21:58:22.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:58:25 compute-1 podman[211377]: 2026-01-27 21:58:25.740810886 +0000 UTC m=+0.055017188 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 21:58:26 compute-1 nova_compute[183751]: 2026-01-27 21:58:26.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:58:32 compute-1 nova_compute[183751]: 2026-01-27 21:58:32.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:58:32 compute-1 nova_compute[183751]: 2026-01-27 21:58:32.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.846 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.848 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.868 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.869 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6171MB free_disk=73.17802047729492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.869 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:58:34 compute-1 nova_compute[183751]: 2026-01-27 21:58:34.869 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:58:35 compute-1 podman[193064]: time="2026-01-27T21:58:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:58:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:58:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:58:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:58:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 21:58:35 compute-1 nova_compute[183751]: 2026-01-27 21:58:35.934 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:58:35 compute-1 nova_compute[183751]: 2026-01-27 21:58:35.935 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:58:34 up  2:00,  0 user,  load average: 0.00, 0.00, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:58:35 compute-1 nova_compute[183751]: 2026-01-27 21:58:35.962 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:58:36 compute-1 nova_compute[183751]: 2026-01-27 21:58:36.470 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:58:36 compute-1 nova_compute[183751]: 2026-01-27 21:58:36.982 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:58:36 compute-1 nova_compute[183751]: 2026-01-27 21:58:36.983 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:58:37 compute-1 nova_compute[183751]: 2026-01-27 21:58:37.983 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:58:39 compute-1 nova_compute[183751]: 2026-01-27 21:58:39.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:58:39 compute-1 nova_compute[183751]: 2026-01-27 21:58:39.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:58:43 compute-1 podman[211402]: 2026-01-27 21:58:43.808856044 +0000 UTC m=+0.115524482 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 21:58:46 compute-1 podman[211428]: 2026-01-27 21:58:46.734537868 +0000 UTC m=+0.051282225 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Jan 27 21:58:46 compute-1 podman[211429]: 2026-01-27 21:58:46.737848411 +0000 UTC m=+0.051417319 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 21:58:49 compute-1 openstack_network_exporter[195945]: ERROR   21:58:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:58:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:58:49 compute-1 openstack_network_exporter[195945]: ERROR   21:58:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:58:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:58:56 compute-1 podman[211467]: 2026-01-27 21:58:56.799689602 +0000 UTC m=+0.107885122 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 21:59:05 compute-1 podman[193064]: time="2026-01-27T21:59:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:59:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:59:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:59:05 compute-1 podman[193064]: @ - - [27/Jan/2026:21:59:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Jan 27 21:59:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:59:11.225 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:59:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:59:11.225 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:59:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 21:59:11.225 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:59:14 compute-1 podman[211493]: 2026-01-27 21:59:14.786253077 +0000 UTC m=+0.096643853 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:59:17 compute-1 podman[211520]: 2026-01-27 21:59:17.7545128 +0000 UTC m=+0.060813893 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 21:59:17 compute-1 podman[211519]: 2026-01-27 21:59:17.790959585 +0000 UTC m=+0.097540675 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 27 21:59:19 compute-1 openstack_network_exporter[195945]: ERROR   21:59:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:59:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:59:19 compute-1 openstack_network_exporter[195945]: ERROR   21:59:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:59:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:59:24 compute-1 nova_compute[183751]: 2026-01-27 21:59:24.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:59:27 compute-1 nova_compute[183751]: 2026-01-27 21:59:27.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:59:27 compute-1 podman[211557]: 2026-01-27 21:59:27.773864474 +0000 UTC m=+0.074214415 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 21:59:32 compute-1 nova_compute[183751]: 2026-01-27 21:59:32.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:59:32 compute-1 nova_compute[183751]: 2026-01-27 21:59:32.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:59:35 compute-1 podman[193064]: time="2026-01-27T21:59:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 21:59:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:59:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 21:59:35 compute-1 podman[193064]: @ - - [27/Jan/2026:21:59:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.855 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.856 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.871 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.871 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6177MB free_disk=73.17802047729492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.872 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 21:59:36 compute-1 nova_compute[183751]: 2026-01-27 21:59:36.872 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 21:59:37 compute-1 nova_compute[183751]: 2026-01-27 21:59:37.938 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 21:59:37 compute-1 nova_compute[183751]: 2026-01-27 21:59:37.939 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 21:59:36 up  2:01,  0 user,  load average: 0.00, 0.00, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 21:59:37 compute-1 nova_compute[183751]: 2026-01-27 21:59:37.968 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 21:59:38 compute-1 nova_compute[183751]: 2026-01-27 21:59:38.476 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 21:59:38 compute-1 nova_compute[183751]: 2026-01-27 21:59:38.989 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 21:59:38 compute-1 nova_compute[183751]: 2026-01-27 21:59:38.990 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 21:59:42 compute-1 nova_compute[183751]: 2026-01-27 21:59:42.989 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:59:42 compute-1 nova_compute[183751]: 2026-01-27 21:59:42.990 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 21:59:44 compute-1 nova_compute[183751]: 2026-01-27 21:59:44.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 21:59:45 compute-1 podman[211582]: 2026-01-27 21:59:45.824341357 +0000 UTC m=+0.124252759 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 21:59:48 compute-1 podman[211610]: 2026-01-27 21:59:48.761961089 +0000 UTC m=+0.072464412 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 27 21:59:48 compute-1 podman[211609]: 2026-01-27 21:59:48.77452034 +0000 UTC m=+0.083248879 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Jan 27 21:59:49 compute-1 openstack_network_exporter[195945]: ERROR   21:59:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 21:59:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:59:49 compute-1 openstack_network_exporter[195945]: ERROR   21:59:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 21:59:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 21:59:58 compute-1 podman[211649]: 2026-01-27 21:59:58.739397944 +0000 UTC m=+0.052523206 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:00:05 compute-1 podman[193064]: time="2026-01-27T22:00:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:00:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:00:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:00:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:00:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 22:00:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:00:11.226 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:00:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:00:11.226 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:00:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:00:11.227 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:00:16 compute-1 podman[211675]: 2026-01-27 22:00:16.818920221 +0000 UTC m=+0.122240799 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 27 22:00:19 compute-1 openstack_network_exporter[195945]: ERROR   22:00:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:00:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:00:19 compute-1 openstack_network_exporter[195945]: ERROR   22:00:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:00:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:00:19 compute-1 podman[211702]: 2026-01-27 22:00:19.746297487 +0000 UTC m=+0.051138181 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 22:00:19 compute-1 podman[211701]: 2026-01-27 22:00:19.762259014 +0000 UTC m=+0.070742829 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 22:00:25 compute-1 nova_compute[183751]: 2026-01-27 22:00:25.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:00:28 compute-1 nova_compute[183751]: 2026-01-27 22:00:28.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:00:29 compute-1 podman[211741]: 2026-01-27 22:00:29.76472714 +0000 UTC m=+0.073141609 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:00:32 compute-1 nova_compute[183751]: 2026-01-27 22:00:32.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:00:33 compute-1 nova_compute[183751]: 2026-01-27 22:00:33.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:00:35 compute-1 podman[193064]: time="2026-01-27T22:00:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:00:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:00:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:00:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:00:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.670 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.863 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.865 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.902 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.903 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6181MB free_disk=73.1781234741211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.903 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:00:37 compute-1 nova_compute[183751]: 2026-01-27 22:00:37.903 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:00:38 compute-1 nova_compute[183751]: 2026-01-27 22:00:38.951 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:00:38 compute-1 nova_compute[183751]: 2026-01-27 22:00:38.952 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:00:37 up  2:03,  0 user,  load average: 0.00, 0.00, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:00:38 compute-1 nova_compute[183751]: 2026-01-27 22:00:38.979 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:00:39 compute-1 nova_compute[183751]: 2026-01-27 22:00:39.488 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:00:40 compute-1 nova_compute[183751]: 2026-01-27 22:00:40.002 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:00:40 compute-1 nova_compute[183751]: 2026-01-27 22:00:40.003 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:00:41 compute-1 nova_compute[183751]: 2026-01-27 22:00:41.002 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:00:41 compute-1 nova_compute[183751]: 2026-01-27 22:00:41.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:00:41 compute-1 nova_compute[183751]: 2026-01-27 22:00:41.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:00:47 compute-1 podman[211767]: 2026-01-27 22:00:47.828990947 +0000 UTC m=+0.128597707 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, 
org.label-schema.build-date=20260126, org.label-schema.license=GPLv2)
Jan 27 22:00:49 compute-1 openstack_network_exporter[195945]: ERROR   22:00:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:00:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:00:49 compute-1 openstack_network_exporter[195945]: ERROR   22:00:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:00:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:00:50 compute-1 podman[211796]: 2026-01-27 22:00:50.745163275 +0000 UTC m=+0.055324256 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 27 22:00:50 compute-1 podman[211795]: 2026-01-27 22:00:50.752504557 +0000 UTC m=+0.065767105 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 22:01:00 compute-1 podman[211836]: 2026-01-27 22:01:00.770954001 +0000 UTC m=+0.075963389 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:01:01 compute-1 CROND[211861]: (root) CMD (run-parts /etc/cron.hourly)
Jan 27 22:01:01 compute-1 run-parts[211864]: (/etc/cron.hourly) starting 0anacron
Jan 27 22:01:01 compute-1 run-parts[211870]: (/etc/cron.hourly) finished 0anacron
Jan 27 22:01:01 compute-1 CROND[211860]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 27 22:01:05 compute-1 podman[193064]: time="2026-01-27T22:01:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:01:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:01:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:01:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:01:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 22:01:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:01:11.228 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:01:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:01:11.229 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:01:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:01:11.229 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:01:18 compute-1 podman[211873]: 2026-01-27 22:01:18.815784253 +0000 UTC m=+0.117611624 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 27 22:01:19 compute-1 openstack_network_exporter[195945]: ERROR   22:01:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:01:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:01:19 compute-1 openstack_network_exporter[195945]: ERROR   22:01:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:01:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:01:21 compute-1 podman[211901]: 2026-01-27 22:01:21.734847204 +0000 UTC m=+0.052912856 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 27 22:01:21 compute-1 podman[211902]: 2026-01-27 22:01:21.744102054 +0000 UTC m=+0.054324001 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 22:01:26 compute-1 sshd-session[211940]: Invalid user mapr from 80.94.92.186 port 42160
Jan 27 22:01:26 compute-1 sshd-session[211940]: Connection closed by invalid user mapr 80.94.92.186 port 42160 [preauth]
Jan 27 22:01:27 compute-1 nova_compute[183751]: 2026-01-27 22:01:27.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:29 compute-1 nova_compute[183751]: 2026-01-27 22:01:29.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:31 compute-1 podman[211942]: 2026-01-27 22:01:31.756233362 +0000 UTC m=+0.058804242 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:01:34 compute-1 nova_compute[183751]: 2026-01-27 22:01:34.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:35 compute-1 nova_compute[183751]: 2026-01-27 22:01:35.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:35 compute-1 podman[193064]: time="2026-01-27T22:01:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:01:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:01:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:01:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:01:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2162 "" "Go-http-client/1.1"
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.872 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.873 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.899 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.899 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6170MB free_disk=73.17802047729492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.899 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:01:38 compute-1 nova_compute[183751]: 2026-01-27 22:01:38.900 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:01:39 compute-1 nova_compute[183751]: 2026-01-27 22:01:39.948 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:01:39 compute-1 nova_compute[183751]: 2026-01-27 22:01:39.949 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:01:38 up  2:04,  0 user,  load average: 0.04, 0.01, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:01:39 compute-1 nova_compute[183751]: 2026-01-27 22:01:39.975 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:01:40 compute-1 nova_compute[183751]: 2026-01-27 22:01:40.481 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:01:40 compute-1 nova_compute[183751]: 2026-01-27 22:01:40.993 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:01:40 compute-1 nova_compute[183751]: 2026-01-27 22:01:40.993 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:01:40 compute-1 nova_compute[183751]: 2026-01-27 22:01:40.993 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:40 compute-1 nova_compute[183751]: 2026-01-27 22:01:40.993 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 22:01:41 compute-1 nova_compute[183751]: 2026-01-27 22:01:41.502 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 22:01:42 compute-1 nova_compute[183751]: 2026-01-27 22:01:42.503 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:42 compute-1 nova_compute[183751]: 2026-01-27 22:01:42.504 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:42 compute-1 nova_compute[183751]: 2026-01-27 22:01:42.505 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:42 compute-1 nova_compute[183751]: 2026-01-27 22:01:42.505 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:01:44 compute-1 nova_compute[183751]: 2026-01-27 22:01:44.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:45 compute-1 nova_compute[183751]: 2026-01-27 22:01:45.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:01:45 compute-1 nova_compute[183751]: 2026-01-27 22:01:45.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 22:01:47 compute-1 sshd-session[211967]: Invalid user AdminGPON from 45.148.10.121 port 36780
Jan 27 22:01:47 compute-1 sshd-session[211967]: Connection closed by invalid user AdminGPON 45.148.10.121 port 36780 [preauth]
Jan 27 22:01:49 compute-1 openstack_network_exporter[195945]: ERROR   22:01:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:01:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:01:49 compute-1 openstack_network_exporter[195945]: ERROR   22:01:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:01:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:01:49 compute-1 podman[211969]: 2026-01-27 22:01:49.819070124 +0000 UTC m=+0.119954362 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2)
Jan 27 22:01:52 compute-1 podman[211998]: 2026-01-27 22:01:52.773570243 +0000 UTC m=+0.071600420 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 27 22:01:52 compute-1 podman[211997]: 2026-01-27 22:01:52.774627849 +0000 UTC m=+0.080630394 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public)
Jan 27 22:02:02 compute-1 podman[212038]: 2026-01-27 22:02:02.751030357 +0000 UTC m=+0.066473302 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:02:05 compute-1 podman[193064]: time="2026-01-27T22:02:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:02:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:02:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:02:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:02:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 27 22:02:06 compute-1 nova_compute[183751]: 2026-01-27 22:02:06.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:02:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:02:11.229 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:02:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:02:11.230 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:02:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:02:11.230 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:02:19 compute-1 openstack_network_exporter[195945]: ERROR   22:02:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:02:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:02:19 compute-1 openstack_network_exporter[195945]: ERROR   22:02:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:02:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:02:20 compute-1 podman[212065]: 2026-01-27 22:02:20.804360341 +0000 UTC m=+0.114002534 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 22:02:23 compute-1 podman[212093]: 2026-01-27 22:02:23.784861538 +0000 UTC m=+0.081120747 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 22:02:23 compute-1 podman[212092]: 2026-01-27 22:02:23.79902684 +0000 UTC m=+0.101014031 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6)
Jan 27 22:02:29 compute-1 nova_compute[183751]: 2026-01-27 22:02:29.660 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:02:30 compute-1 nova_compute[183751]: 2026-01-27 22:02:30.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:02:33 compute-1 podman[212134]: 2026-01-27 22:02:33.778621257 +0000 UTC m=+0.080119312 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:02:35 compute-1 nova_compute[183751]: 2026-01-27 22:02:35.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:02:35 compute-1 podman[193064]: time="2026-01-27T22:02:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:02:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:02:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:02:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:02:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 22:02:36 compute-1 nova_compute[183751]: 2026-01-27 22:02:36.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:02:36 compute-1 nova_compute[183751]: 2026-01-27 22:02:36.787 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.669 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.855 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.856 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.871 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.872 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6176MB free_disk=73.17802047729492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.872 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:02:38 compute-1 nova_compute[183751]: 2026-01-27 22:02:38.873 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:02:39 compute-1 nova_compute[183751]: 2026-01-27 22:02:39.953 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:02:39 compute-1 nova_compute[183751]: 2026-01-27 22:02:39.954 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:02:38 up  2:05,  0 user,  load average: 0.01, 0.01, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:02:40 compute-1 nova_compute[183751]: 2026-01-27 22:02:40.032 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 22:02:40 compute-1 nova_compute[183751]: 2026-01-27 22:02:40.118 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 22:02:40 compute-1 nova_compute[183751]: 2026-01-27 22:02:40.119 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:02:40 compute-1 nova_compute[183751]: 2026-01-27 22:02:40.133 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 22:02:40 compute-1 nova_compute[183751]: 2026-01-27 22:02:40.163 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 22:02:40 compute-1 nova_compute[183751]: 2026-01-27 22:02:40.187 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:02:40 compute-1 nova_compute[183751]: 2026-01-27 22:02:40.698 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:02:41 compute-1 nova_compute[183751]: 2026-01-27 22:02:41.210 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:02:41 compute-1 nova_compute[183751]: 2026-01-27 22:02:41.210 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.337s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:02:43 compute-1 nova_compute[183751]: 2026-01-27 22:02:43.210 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:02:43 compute-1 nova_compute[183751]: 2026-01-27 22:02:43.211 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:02:43 compute-1 nova_compute[183751]: 2026-01-27 22:02:43.211 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:02:43 compute-1 nova_compute[183751]: 2026-01-27 22:02:43.211 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:02:49 compute-1 openstack_network_exporter[195945]: ERROR   22:02:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:02:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:02:49 compute-1 openstack_network_exporter[195945]: ERROR   22:02:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:02:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:02:51 compute-1 podman[212159]: 2026-01-27 22:02:51.812553529 +0000 UTC m=+0.123035978 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Jan 27 22:02:54 compute-1 podman[212186]: 2026-01-27 22:02:54.751973687 +0000 UTC m=+0.067301084 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Jan 27 22:02:54 compute-1 podman[212187]: 2026-01-27 22:02:54.757062743 +0000 UTC m=+0.064529284 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 27 22:03:04 compute-1 podman[212225]: 2026-01-27 22:03:04.771196979 +0000 UTC m=+0.073493738 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:03:05 compute-1 podman[193064]: time="2026-01-27T22:03:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:03:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:03:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:03:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:03:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Jan 27 22:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:03:11.231 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:03:11.232 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:03:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:03:11.232 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:03:19 compute-1 openstack_network_exporter[195945]: ERROR   22:03:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:03:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:03:19 compute-1 openstack_network_exporter[195945]: ERROR   22:03:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:03:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:03:22 compute-1 podman[212250]: 2026-01-27 22:03:22.826160724 +0000 UTC m=+0.128442332 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Jan 27 22:03:25 compute-1 podman[212276]: 2026-01-27 22:03:25.779006816 +0000 UTC m=+0.083645032 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Jan 27 22:03:25 compute-1 podman[212277]: 2026-01-27 22:03:25.779265253 +0000 UTC m=+0.081388097 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 27 22:03:29 compute-1 nova_compute[183751]: 2026-01-27 22:03:29.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:03:30 compute-1 nova_compute[183751]: 2026-01-27 22:03:30.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:03:35 compute-1 podman[193064]: time="2026-01-27T22:03:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:03:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:03:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:03:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:03:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 22:03:35 compute-1 podman[212317]: 2026-01-27 22:03:35.782521673 +0000 UTC m=+0.082528624 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:03:36 compute-1 nova_compute[183751]: 2026-01-27 22:03:36.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:03:37 compute-1 nova_compute[183751]: 2026-01-27 22:03:37.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.662 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.864 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.865 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.890 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.891 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6174MB free_disk=73.17802047729492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.891 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:03:40 compute-1 nova_compute[183751]: 2026-01-27 22:03:40.892 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:03:41 compute-1 nova_compute[183751]: 2026-01-27 22:03:41.953 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:03:41 compute-1 nova_compute[183751]: 2026-01-27 22:03:41.954 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:03:40 up  2:06,  0 user,  load average: 0.00, 0.00, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:03:41 compute-1 nova_compute[183751]: 2026-01-27 22:03:41.985 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:03:42 compute-1 nova_compute[183751]: 2026-01-27 22:03:42.494 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:03:43 compute-1 nova_compute[183751]: 2026-01-27 22:03:43.010 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:03:43 compute-1 nova_compute[183751]: 2026-01-27 22:03:43.011 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.119s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:03:44 compute-1 nova_compute[183751]: 2026-01-27 22:03:44.011 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:03:44 compute-1 nova_compute[183751]: 2026-01-27 22:03:44.012 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:03:44 compute-1 nova_compute[183751]: 2026-01-27 22:03:44.012 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:03:44 compute-1 nova_compute[183751]: 2026-01-27 22:03:44.012 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:03:47 compute-1 nova_compute[183751]: 2026-01-27 22:03:47.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:03:49 compute-1 openstack_network_exporter[195945]: ERROR   22:03:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:03:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:03:49 compute-1 openstack_network_exporter[195945]: ERROR   22:03:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:03:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:03:53 compute-1 podman[212342]: 2026-01-27 22:03:53.79851241 +0000 UTC m=+0.108421946 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260126, tcib_managed=true)
Jan 27 22:03:56 compute-1 podman[212370]: 2026-01-27 22:03:56.758740194 +0000 UTC m=+0.067475361 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 27 22:03:56 compute-1 podman[212369]: 2026-01-27 22:03:56.781678542 +0000 UTC m=+0.082324629 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Jan 27 22:04:05 compute-1 podman[193064]: time="2026-01-27T22:04:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:04:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:04:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:04:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:04:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 22:04:06 compute-1 podman[212406]: 2026-01-27 22:04:06.740615685 +0000 UTC m=+0.055041805 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:04:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:04:11.233 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:04:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:04:11.233 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:04:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:04:11.233 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:04:19 compute-1 openstack_network_exporter[195945]: ERROR   22:04:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:04:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:04:19 compute-1 openstack_network_exporter[195945]: ERROR   22:04:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:04:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:04:24 compute-1 podman[212432]: 2026-01-27 22:04:24.780118874 +0000 UTC m=+0.090578634 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, 
tcib_build_tag=watcher_latest, config_id=ovn_controller)
Jan 27 22:04:27 compute-1 podman[212458]: 2026-01-27 22:04:27.905646993 +0000 UTC m=+0.071832029 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 27 22:04:27 compute-1 podman[212459]: 2026-01-27 22:04:27.917252101 +0000 UTC m=+0.076184398 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 22:04:30 compute-1 nova_compute[183751]: 2026-01-27 22:04:30.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:04:31 compute-1 nova_compute[183751]: 2026-01-27 22:04:31.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:04:35 compute-1 podman[193064]: time="2026-01-27T22:04:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:04:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:04:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:04:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:04:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 22:04:36 compute-1 nova_compute[183751]: 2026-01-27 22:04:36.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:04:37 compute-1 podman[212499]: 2026-01-27 22:04:37.780021973 +0000 UTC m=+0.073288346 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:04:39 compute-1 nova_compute[183751]: 2026-01-27 22:04:39.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:04:41 compute-1 nova_compute[183751]: 2026-01-27 22:04:41.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.671 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.671 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.672 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.672 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.880 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.881 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.894 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.895 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6183MB free_disk=73.17802047729492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.895 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:04:42 compute-1 nova_compute[183751]: 2026-01-27 22:04:42.895 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:04:43 compute-1 nova_compute[183751]: 2026-01-27 22:04:43.952 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:04:43 compute-1 nova_compute[183751]: 2026-01-27 22:04:43.953 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:04:42 up  2:07,  0 user,  load average: 0.00, 0.00, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:04:43 compute-1 nova_compute[183751]: 2026-01-27 22:04:43.983 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:04:44 compute-1 nova_compute[183751]: 2026-01-27 22:04:44.492 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:04:45 compute-1 nova_compute[183751]: 2026-01-27 22:04:45.014 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:04:45 compute-1 nova_compute[183751]: 2026-01-27 22:04:45.015 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.119s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:04:46 compute-1 nova_compute[183751]: 2026-01-27 22:04:46.014 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:04:46 compute-1 nova_compute[183751]: 2026-01-27 22:04:46.015 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:04:46 compute-1 nova_compute[183751]: 2026-01-27 22:04:46.015 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:04:49 compute-1 openstack_network_exporter[195945]: ERROR   22:04:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:04:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:04:49 compute-1 openstack_network_exporter[195945]: ERROR   22:04:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:04:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:04:55 compute-1 podman[212522]: 2026-01-27 22:04:55.808466757 +0000 UTC m=+0.109754569 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Jan 27 22:04:58 compute-1 podman[212547]: 2026-01-27 22:04:58.781736974 +0000 UTC m=+0.076206828 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 27 22:04:58 compute-1 podman[212546]: 2026-01-27 22:04:58.794922761 +0000 UTC m=+0.098197253 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter)
Jan 27 22:05:05 compute-1 podman[193064]: time="2026-01-27T22:05:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:05:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:05:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:05:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:05:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 22:05:08 compute-1 podman[212584]: 2026-01-27 22:05:08.770530797 +0000 UTC m=+0.076340132 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:05:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:05:11.234 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:05:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:05:11.234 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:05:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:05:11.234 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:05:19 compute-1 openstack_network_exporter[195945]: ERROR   22:05:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:05:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:05:19 compute-1 openstack_network_exporter[195945]: ERROR   22:05:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:05:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:05:26 compute-1 podman[212609]: 2026-01-27 22:05:26.829193118 +0000 UTC m=+0.126464642 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 27 22:05:29 compute-1 podman[212635]: 2026-01-27 22:05:29.771002597 +0000 UTC m=+0.080273529 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6)
Jan 27 22:05:29 compute-1 podman[212636]: 2026-01-27 22:05:29.798133289 +0000 UTC m=+0.095357712 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 27 22:05:32 compute-1 nova_compute[183751]: 2026-01-27 22:05:32.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:05:33 compute-1 nova_compute[183751]: 2026-01-27 22:05:33.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:05:35 compute-1 podman[193064]: time="2026-01-27T22:05:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:05:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:05:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:05:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:05:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 27 22:05:38 compute-1 nova_compute[183751]: 2026-01-27 22:05:38.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:05:39 compute-1 podman[212675]: 2026-01-27 22:05:39.736525013 +0000 UTC m=+0.054413648 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:05:40 compute-1 nova_compute[183751]: 2026-01-27 22:05:40.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:05:41 compute-1 nova_compute[183751]: 2026-01-27 22:05:41.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.672 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.672 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.672 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.673 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.897 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.898 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.932 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.933 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6172MB free_disk=73.17802047729492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.933 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:05:43 compute-1 nova_compute[183751]: 2026-01-27 22:05:43.933 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:05:44 compute-1 nova_compute[183751]: 2026-01-27 22:05:44.983 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:05:44 compute-1 nova_compute[183751]: 2026-01-27 22:05:44.983 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:05:43 up  2:08,  0 user,  load average: 0.00, 0.00, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:05:45 compute-1 nova_compute[183751]: 2026-01-27 22:05:45.005 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:05:45 compute-1 nova_compute[183751]: 2026-01-27 22:05:45.515 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:05:46 compute-1 nova_compute[183751]: 2026-01-27 22:05:46.024 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:05:46 compute-1 nova_compute[183751]: 2026-01-27 22:05:46.025 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:05:49 compute-1 openstack_network_exporter[195945]: ERROR   22:05:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:05:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:05:49 compute-1 openstack_network_exporter[195945]: ERROR   22:05:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:05:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:05:53 compute-1 nova_compute[183751]: 2026-01-27 22:05:53.021 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:05:57 compute-1 podman[212701]: 2026-01-27 22:05:57.834921371 +0000 UTC m=+0.145479103 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.build-date=20260126)
Jan 27 22:06:00 compute-1 podman[212729]: 2026-01-27 22:06:00.772506337 +0000 UTC m=+0.075142371 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 22:06:00 compute-1 podman[212728]: 2026-01-27 22:06:00.773793159 +0000 UTC m=+0.081439658 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Jan 27 22:06:05 compute-1 podman[193064]: time="2026-01-27T22:06:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:06:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:06:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:06:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:06:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 22:06:08 compute-1 sshd-session[212767]: Invalid user latitude from 80.94.92.186 port 45198
Jan 27 22:06:08 compute-1 sshd-session[212767]: Connection closed by invalid user latitude 80.94.92.186 port 45198 [preauth]
Jan 27 22:06:10 compute-1 podman[212769]: 2026-01-27 22:06:10.741838808 +0000 UTC m=+0.052899221 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:06:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:11.236 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:06:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:11.236 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:06:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:11.236 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:06:27 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:27.075 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:06:27 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:27.075 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:06:27 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:27.078 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:06:29 compute-1 podman[212796]: 2026-01-27 22:06:29.252233135 +0000 UTC m=+0.097017983 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260126)
Jan 27 22:06:31 compute-1 podman[212825]: 2026-01-27 22:06:31.773414409 +0000 UTC m=+0.068842635 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260126, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:06:31 compute-1 podman[212824]: 2026-01-27 22:06:31.790724918 +0000 UTC m=+0.095512916 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, config_id=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 27 22:06:32 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:32.526 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:95:20 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-63b1d90c-e013-4be2-9b95-fd6ab1cbf0b1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63b1d90c-e013-4be2-9b95-fd6ab1cbf0b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f70ec523177247bdb6ca1b7e476d21b7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fa43581-9fba-423b-ab66-ea6feef85825, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5f955501-15d4-42fa-ad76-59aa52ba1d51) old=Port_Binding(mac=['fa:16:3e:e9:95:20'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-63b1d90c-e013-4be2-9b95-fd6ab1cbf0b1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63b1d90c-e013-4be2-9b95-fd6ab1cbf0b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f70ec523177247bdb6ca1b7e476d21b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:06:32 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:32.527 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5f955501-15d4-42fa-ad76-59aa52ba1d51 in datapath 63b1d90c-e013-4be2-9b95-fd6ab1cbf0b1 updated
Jan 27 22:06:32 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:32.529 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 63b1d90c-e013-4be2-9b95-fd6ab1cbf0b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:06:32 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:32.531 105247 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpl49cilbs/privsep.sock']
Jan 27 22:06:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:33.303 105247 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 27 22:06:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:33.303 105247 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpl49cilbs/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Jan 27 22:06:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:33.168 212869 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 27 22:06:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:33.174 212869 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 27 22:06:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:33.178 212869 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 27 22:06:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:33.178 212869 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212869
Jan 27 22:06:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:33.306 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[61b6c0f8-fec8-4a8a-93d1-b3272943ba2b]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:06:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:33.759 212869 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:06:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:33.759 212869 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:06:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:33.759 212869 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:06:34 compute-1 nova_compute[183751]: 2026-01-27 22:06:34.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:06:34 compute-1 nova_compute[183751]: 2026-01-27 22:06:34.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:06:34 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:34.242 212869 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 27 22:06:34 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:34.247 212869 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 27 22:06:34 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:06:34.284 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[9baa2108-6f7a-46c0-8d32-0d2926f52b19]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:06:35 compute-1 podman[193064]: time="2026-01-27T22:06:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:06:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:06:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:06:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:06:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2163 "" "Go-http-client/1.1"
Jan 27 22:06:39 compute-1 nova_compute[183751]: 2026-01-27 22:06:39.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:06:40 compute-1 nova_compute[183751]: 2026-01-27 22:06:40.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:06:41 compute-1 podman[212874]: 2026-01-27 22:06:41.765110465 +0000 UTC m=+0.072571348 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:06:42 compute-1 nova_compute[183751]: 2026-01-27 22:06:42.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:06:43 compute-1 nova_compute[183751]: 2026-01-27 22:06:43.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:06:43 compute-1 nova_compute[183751]: 2026-01-27 22:06:43.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.861 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.863 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.883 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.884 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6030MB free_disk=73.17789840698242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.884 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:06:44 compute-1 nova_compute[183751]: 2026-01-27 22:06:44.884 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:06:45 compute-1 nova_compute[183751]: 2026-01-27 22:06:45.942 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:06:45 compute-1 nova_compute[183751]: 2026-01-27 22:06:45.942 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:06:44 up  2:09,  0 user,  load average: 0.12, 0.03, 0.01\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:06:45 compute-1 nova_compute[183751]: 2026-01-27 22:06:45.963 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:06:46 compute-1 nova_compute[183751]: 2026-01-27 22:06:46.471 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:06:46 compute-1 nova_compute[183751]: 2026-01-27 22:06:46.981 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:06:46 compute-1 nova_compute[183751]: 2026-01-27 22:06:46.982 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:06:46 compute-1 nova_compute[183751]: 2026-01-27 22:06:46.982 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:06:46 compute-1 nova_compute[183751]: 2026-01-27 22:06:46.983 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 22:06:47 compute-1 nova_compute[183751]: 2026-01-27 22:06:47.490 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 22:06:48 compute-1 nova_compute[183751]: 2026-01-27 22:06:48.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:06:48 compute-1 nova_compute[183751]: 2026-01-27 22:06:48.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:06:48 compute-1 nova_compute[183751]: 2026-01-27 22:06:48.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 22:06:49 compute-1 openstack_network_exporter[195945]: ERROR   22:06:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:06:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:06:49 compute-1 openstack_network_exporter[195945]: ERROR   22:06:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:06:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:06:59 compute-1 podman[212899]: 2026-01-27 22:06:59.843339262 +0000 UTC m=+0.138164032 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0)
Jan 27 22:07:02 compute-1 podman[212926]: 2026-01-27 22:07:02.791818366 +0000 UTC m=+0.083503259 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, container_name=ovn_metadata_agent)
Jan 27 22:07:02 compute-1 podman[212925]: 2026-01-27 22:07:02.796131773 +0000 UTC m=+0.095941277 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 27 22:07:05 compute-1 podman[193064]: time="2026-01-27T22:07:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:07:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:07:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:07:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:07:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Jan 27 22:07:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:07:11.238 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:07:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:07:11.238 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:07:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:07:11.239 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:07:12 compute-1 nova_compute[183751]: 2026-01-27 22:07:12.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:07:12 compute-1 podman[212963]: 2026-01-27 22:07:12.766201211 +0000 UTC m=+0.070041165 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:07:19 compute-1 openstack_network_exporter[195945]: ERROR   22:07:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:07:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:07:19 compute-1 openstack_network_exporter[195945]: ERROR   22:07:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:07:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:07:21 compute-1 sshd-session[212987]: Received disconnect from 45.148.10.141 port 11226:11:  [preauth]
Jan 27 22:07:21 compute-1 sshd-session[212987]: Disconnected from authenticating user root 45.148.10.141 port 11226 [preauth]
Jan 27 22:07:30 compute-1 podman[212989]: 2026-01-27 22:07:30.817518371 +0000 UTC m=+0.123483089 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2)
Jan 27 22:07:33 compute-1 podman[213018]: 2026-01-27 22:07:33.74719421 +0000 UTC m=+0.050799259 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:07:33 compute-1 podman[213017]: 2026-01-27 22:07:33.764976871 +0000 UTC m=+0.060445028 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9)
Jan 27 22:07:34 compute-1 nova_compute[183751]: 2026-01-27 22:07:34.655 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:07:35 compute-1 nova_compute[183751]: 2026-01-27 22:07:35.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:07:35 compute-1 podman[193064]: time="2026-01-27T22:07:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:07:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:07:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:07:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:07:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 22:07:40 compute-1 nova_compute[183751]: 2026-01-27 22:07:40.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:07:41 compute-1 nova_compute[183751]: 2026-01-27 22:07:41.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:07:42 compute-1 nova_compute[183751]: 2026-01-27 22:07:42.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:07:43 compute-1 podman[213055]: 2026-01-27 22:07:43.765263337 +0000 UTC m=+0.073637264 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.666 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.875 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.876 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.899 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.900 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6062MB free_disk=73.17782211303711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.901 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:07:44 compute-1 nova_compute[183751]: 2026-01-27 22:07:44.902 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:07:46 compute-1 nova_compute[183751]: 2026-01-27 22:07:46.043 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:07:46 compute-1 nova_compute[183751]: 2026-01-27 22:07:46.044 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:07:44 up  2:10,  0 user,  load average: 0.04, 0.02, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:07:46 compute-1 nova_compute[183751]: 2026-01-27 22:07:46.129 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 22:07:46 compute-1 nova_compute[183751]: 2026-01-27 22:07:46.196 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 22:07:46 compute-1 nova_compute[183751]: 2026-01-27 22:07:46.197 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:07:46 compute-1 nova_compute[183751]: 2026-01-27 22:07:46.215 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 22:07:46 compute-1 nova_compute[183751]: 2026-01-27 22:07:46.244 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 22:07:46 compute-1 nova_compute[183751]: 2026-01-27 22:07:46.277 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:07:46 compute-1 nova_compute[183751]: 2026-01-27 22:07:46.787 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:07:47 compute-1 nova_compute[183751]: 2026-01-27 22:07:47.301 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:07:47 compute-1 nova_compute[183751]: 2026-01-27 22:07:47.301 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.399s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:07:48 compute-1 nova_compute[183751]: 2026-01-27 22:07:48.301 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:07:49 compute-1 openstack_network_exporter[195945]: ERROR   22:07:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:07:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:07:49 compute-1 openstack_network_exporter[195945]: ERROR   22:07:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:07:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:07:55 compute-1 nova_compute[183751]: 2026-01-27 22:07:55.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:08:01 compute-1 podman[213081]: 2026-01-27 22:08:01.784404709 +0000 UTC m=+0.092709507 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:08:04 compute-1 podman[213109]: 2026-01-27 22:08:04.75782334 +0000 UTC m=+0.065237186 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126)
Jan 27 22:08:04 compute-1 podman[213108]: 2026-01-27 22:08:04.77397889 +0000 UTC m=+0.077417537 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 27 22:08:05 compute-1 podman[193064]: time="2026-01-27T22:08:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:08:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:08:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:08:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:08:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 22:08:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:08:11.239 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:08:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:08:11.240 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:08:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:08:11.240 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:08:14 compute-1 podman[213150]: 2026-01-27 22:08:14.758065827 +0000 UTC m=+0.064991401 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:08:19 compute-1 openstack_network_exporter[195945]: ERROR   22:08:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:08:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:08:19 compute-1 openstack_network_exporter[195945]: ERROR   22:08:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:08:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:08:32 compute-1 podman[213174]: 2026-01-27 22:08:32.818878342 +0000 UTC m=+0.126388621 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126)
Jan 27 22:08:35 compute-1 nova_compute[183751]: 2026-01-27 22:08:35.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:08:35 compute-1 podman[193064]: time="2026-01-27T22:08:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:08:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:08:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:08:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:08:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 22:08:35 compute-1 podman[213200]: 2026-01-27 22:08:35.791962695 +0000 UTC m=+0.098366747 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 22:08:35 compute-1 podman[213199]: 2026-01-27 22:08:35.792693993 +0000 UTC m=+0.091213740 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, release=1755695350, version=9.6, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc.)
Jan 27 22:08:36 compute-1 nova_compute[183751]: 2026-01-27 22:08:36.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:08:40 compute-1 nova_compute[183751]: 2026-01-27 22:08:40.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:08:41 compute-1 nova_compute[183751]: 2026-01-27 22:08:41.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:08:44 compute-1 nova_compute[183751]: 2026-01-27 22:08:44.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:08:45 compute-1 podman[213241]: 2026-01-27 22:08:45.81520008 +0000 UTC m=+0.110936707 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.668 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.871 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.873 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.890 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.891 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6073MB free_disk=73.17784118652344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.891 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:08:46 compute-1 nova_compute[183751]: 2026-01-27 22:08:46.892 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:08:48 compute-1 nova_compute[183751]: 2026-01-27 22:08:48.352 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:08:48 compute-1 nova_compute[183751]: 2026-01-27 22:08:48.353 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:08:46 up  2:11,  0 user,  load average: 0.01, 0.02, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:08:48 compute-1 nova_compute[183751]: 2026-01-27 22:08:48.410 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:08:48 compute-1 nova_compute[183751]: 2026-01-27 22:08:48.917 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:08:49 compute-1 openstack_network_exporter[195945]: ERROR   22:08:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:08:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:08:49 compute-1 openstack_network_exporter[195945]: ERROR   22:08:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:08:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:08:49 compute-1 nova_compute[183751]: 2026-01-27 22:08:49.430 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:08:49 compute-1 nova_compute[183751]: 2026-01-27 22:08:49.431 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.539s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:08:50 compute-1 nova_compute[183751]: 2026-01-27 22:08:50.430 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:09:03 compute-1 podman[213267]: 2026-01-27 22:09:03.808847573 +0000 UTC m=+0.126312268 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 22:09:05 compute-1 podman[193064]: time="2026-01-27T22:09:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:09:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:09:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:09:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:09:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Jan 27 22:09:06 compute-1 podman[213294]: 2026-01-27 22:09:06.782240856 +0000 UTC m=+0.087662672 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0)
Jan 27 22:09:06 compute-1 podman[213293]: 2026-01-27 22:09:06.78442756 +0000 UTC m=+0.092453911 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350)
Jan 27 22:09:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:09:11.241 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:09:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:09:11.241 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:09:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:09:11.241 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:09:16 compute-1 podman[213331]: 2026-01-27 22:09:16.759783138 +0000 UTC m=+0.069234215 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:09:19 compute-1 openstack_network_exporter[195945]: ERROR   22:09:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:09:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:09:19 compute-1 openstack_network_exporter[195945]: ERROR   22:09:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:09:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:09:34 compute-1 podman[213356]: 2026-01-27 22:09:34.833274389 +0000 UTC m=+0.132639475 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 27 22:09:35 compute-1 nova_compute[183751]: 2026-01-27 22:09:35.151 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:09:35 compute-1 podman[193064]: time="2026-01-27T22:09:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:09:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:09:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:09:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:09:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 22:09:37 compute-1 podman[213383]: 2026-01-27 22:09:37.774877224 +0000 UTC m=+0.073032749 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2)
Jan 27 22:09:37 compute-1 podman[213382]: 2026-01-27 22:09:37.787847185 +0000 UTC m=+0.097612728 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 22:09:38 compute-1 nova_compute[183751]: 2026-01-27 22:09:38.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:09:40 compute-1 nova_compute[183751]: 2026-01-27 22:09:40.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:09:42 compute-1 nova_compute[183751]: 2026-01-27 22:09:42.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:09:46 compute-1 nova_compute[183751]: 2026-01-27 22:09:46.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:09:46 compute-1 nova_compute[183751]: 2026-01-27 22:09:46.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:09:46 compute-1 nova_compute[183751]: 2026-01-27 22:09:46.150 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.667 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:09:47 compute-1 podman[213419]: 2026-01-27 22:09:47.783792475 +0000 UTC m=+0.084605786 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.865 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.867 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.884 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.885 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6079MB free_disk=73.17784118652344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.885 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:09:47 compute-1 nova_compute[183751]: 2026-01-27 22:09:47.885 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:09:48 compute-1 nova_compute[183751]: 2026-01-27 22:09:48.966 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:09:48 compute-1 nova_compute[183751]: 2026-01-27 22:09:48.967 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:09:47 up  2:12,  0 user,  load average: 0.00, 0.01, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:09:48 compute-1 nova_compute[183751]: 2026-01-27 22:09:48.987 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:09:49 compute-1 nova_compute[183751]: 2026-01-27 22:09:49.495 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:09:49 compute-1 openstack_network_exporter[195945]: ERROR   22:09:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:09:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:09:49 compute-1 openstack_network_exporter[195945]: ERROR   22:09:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:09:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:09:50 compute-1 nova_compute[183751]: 2026-01-27 22:09:50.122 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:09:50 compute-1 nova_compute[183751]: 2026-01-27 22:09:50.123 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.238s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:09:52 compute-1 nova_compute[183751]: 2026-01-27 22:09:52.123 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:09:55 compute-1 nova_compute[183751]: 2026-01-27 22:09:55.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:10:05 compute-1 podman[193064]: time="2026-01-27T22:10:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:10:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:10:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:10:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:10:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2162 "" "Go-http-client/1.1"
Jan 27 22:10:05 compute-1 podman[213444]: 2026-01-27 22:10:05.79778065 +0000 UTC m=+0.105796331 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, 
io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 27 22:10:08 compute-1 podman[213472]: 2026-01-27 22:10:08.764367723 +0000 UTC m=+0.067089432 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Jan 27 22:10:08 compute-1 podman[213471]: 2026-01-27 22:10:08.791519325 +0000 UTC m=+0.087427775 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible)
Jan 27 22:10:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:10:11.242 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:10:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:10:11.243 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:10:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:10:11.243 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:10:18 compute-1 podman[213513]: 2026-01-27 22:10:18.755179675 +0000 UTC m=+0.068212629 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:10:19 compute-1 openstack_network_exporter[195945]: ERROR   22:10:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:10:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:10:19 compute-1 openstack_network_exporter[195945]: ERROR   22:10:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:10:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:10:35 compute-1 nova_compute[183751]: 2026-01-27 22:10:35.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:10:35 compute-1 podman[193064]: time="2026-01-27T22:10:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:10:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:10:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:10:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:10:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 22:10:36 compute-1 podman[213537]: 2026-01-27 22:10:36.816033441 +0000 UTC m=+0.121448528 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:10:39 compute-1 podman[213566]: 2026-01-27 22:10:39.766058194 +0000 UTC m=+0.066602550 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 27 22:10:39 compute-1 podman[213567]: 2026-01-27 22:10:39.792381316 +0000 UTC m=+0.084012021 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 27 22:10:39 compute-1 sshd-session[213564]: Invalid user latitude from 80.94.92.186 port 48218
Jan 27 22:10:40 compute-1 nova_compute[183751]: 2026-01-27 22:10:40.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:10:40 compute-1 sshd-session[213564]: Connection closed by invalid user latitude 80.94.92.186 port 48218 [preauth]
Jan 27 22:10:41 compute-1 nova_compute[183751]: 2026-01-27 22:10:41.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:10:44 compute-1 nova_compute[183751]: 2026-01-27 22:10:44.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:10:46 compute-1 nova_compute[183751]: 2026-01-27 22:10:46.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:10:47 compute-1 nova_compute[183751]: 2026-01-27 22:10:47.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:10:47 compute-1 nova_compute[183751]: 2026-01-27 22:10:47.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:10:47 compute-1 nova_compute[183751]: 2026-01-27 22:10:47.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.857 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.858 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.882 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.882 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6078MB free_disk=73.17784118652344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.883 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:10:48 compute-1 nova_compute[183751]: 2026-01-27 22:10:48.883 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:10:49 compute-1 openstack_network_exporter[195945]: ERROR   22:10:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:10:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:10:49 compute-1 openstack_network_exporter[195945]: ERROR   22:10:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:10:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:10:49 compute-1 podman[213609]: 2026-01-27 22:10:49.778395612 +0000 UTC m=+0.077624053 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:10:50 compute-1 nova_compute[183751]: 2026-01-27 22:10:50.010 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:10:50 compute-1 nova_compute[183751]: 2026-01-27 22:10:50.010 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:10:48 up  2:13,  0 user,  load average: 0.00, 0.01, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:10:50 compute-1 nova_compute[183751]: 2026-01-27 22:10:50.047 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:10:50 compute-1 nova_compute[183751]: 2026-01-27 22:10:50.556 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:10:51 compute-1 nova_compute[183751]: 2026-01-27 22:10:51.122 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:10:51 compute-1 nova_compute[183751]: 2026-01-27 22:10:51.122 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.239s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:10:53 compute-1 nova_compute[183751]: 2026-01-27 22:10:53.124 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:05 compute-1 podman[193064]: time="2026-01-27T22:11:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:11:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:11:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:11:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:11:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 22:11:07 compute-1 podman[213635]: 2026-01-27 22:11:07.813759807 +0000 UTC m=+0.122888034 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260126)
Jan 27 22:11:10 compute-1 podman[213662]: 2026-01-27 22:11:10.766035865 +0000 UTC m=+0.071703476 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 27 22:11:10 compute-1 podman[213663]: 2026-01-27 22:11:10.809963823 +0000 UTC m=+0.106863097 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 22:11:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:11:11.244 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:11:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:11:11.244 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:11:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:11:11.244 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:11:19 compute-1 openstack_network_exporter[195945]: ERROR   22:11:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:11:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:11:19 compute-1 openstack_network_exporter[195945]: ERROR   22:11:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:11:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:11:20 compute-1 podman[213704]: 2026-01-27 22:11:20.799321609 +0000 UTC m=+0.104550870 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:11:35 compute-1 podman[193064]: time="2026-01-27T22:11:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:11:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:11:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:11:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:11:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 22:11:37 compute-1 nova_compute[183751]: 2026-01-27 22:11:37.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:38 compute-1 podman[213728]: 2026-01-27 22:11:38.794152502 +0000 UTC m=+0.105900283 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 27 22:11:41 compute-1 podman[213756]: 2026-01-27 22:11:41.777328908 +0000 UTC m=+0.075552401 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2)
Jan 27 22:11:41 compute-1 podman[213755]: 2026-01-27 22:11:41.7874944 +0000 UTC m=+0.083783055 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, config_id=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Jan 27 22:11:42 compute-1 nova_compute[183751]: 2026-01-27 22:11:42.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:42 compute-1 nova_compute[183751]: 2026-01-27 22:11:42.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:44 compute-1 nova_compute[183751]: 2026-01-27 22:11:44.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:47 compute-1 nova_compute[183751]: 2026-01-27 22:11:47.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.858 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.859 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.891 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.893 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6078MB free_disk=73.1774673461914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.893 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:11:48 compute-1 nova_compute[183751]: 2026-01-27 22:11:48.894 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:11:49 compute-1 openstack_network_exporter[195945]: ERROR   22:11:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:11:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:11:49 compute-1 openstack_network_exporter[195945]: ERROR   22:11:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:11:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:11:49 compute-1 nova_compute[183751]: 2026-01-27 22:11:49.949 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:11:49 compute-1 nova_compute[183751]: 2026-01-27 22:11:49.949 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:11:48 up  2:14,  0 user,  load average: 0.00, 0.00, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:11:49 compute-1 nova_compute[183751]: 2026-01-27 22:11:49.981 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:11:50 compute-1 nova_compute[183751]: 2026-01-27 22:11:50.490 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:11:51 compute-1 nova_compute[183751]: 2026-01-27 22:11:51.000 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:11:51 compute-1 nova_compute[183751]: 2026-01-27 22:11:51.001 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:11:51 compute-1 nova_compute[183751]: 2026-01-27 22:11:51.002 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:51 compute-1 nova_compute[183751]: 2026-01-27 22:11:51.002 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 22:11:51 compute-1 nova_compute[183751]: 2026-01-27 22:11:51.509 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 22:11:51 compute-1 podman[213794]: 2026-01-27 22:11:51.781347505 +0000 UTC m=+0.082820660 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:11:52 compute-1 nova_compute[183751]: 2026-01-27 22:11:52.511 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:52 compute-1 nova_compute[183751]: 2026-01-27 22:11:52.511 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:52 compute-1 nova_compute[183751]: 2026-01-27 22:11:52.512 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:11:53 compute-1 nova_compute[183751]: 2026-01-27 22:11:53.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:11:53 compute-1 nova_compute[183751]: 2026-01-27 22:11:53.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 22:11:58 compute-1 podman[193064]: time="2026-01-27T22:11:58Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:11:58 compute-1 podman[193064]: @ - - [27/Jan/2026:22:11:58 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 15563 "" "Go-http-client/1.1"
Jan 27 22:12:00 compute-1 nova_compute[183751]: 2026-01-27 22:12:00.651 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:05 compute-1 podman[193064]: time="2026-01-27T22:12:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:12:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:12:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:12:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:12:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 22:12:09 compute-1 podman[213819]: 2026-01-27 22:12:09.848352459 +0000 UTC m=+0.152093366 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, 
org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0)
Jan 27 22:12:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:12:11.245 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:12:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:12:11.246 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:12:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:12:11.246 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:12:12 compute-1 podman[213846]: 2026-01-27 22:12:12.772034759 +0000 UTC m=+0.079998451 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350)
Jan 27 22:12:12 compute-1 podman[213847]: 2026-01-27 22:12:12.782756025 +0000 UTC m=+0.078324790 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 22:12:15 compute-1 nova_compute[183751]: 2026-01-27 22:12:15.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:19 compute-1 openstack_network_exporter[195945]: ERROR   22:12:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:12:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:12:19 compute-1 openstack_network_exporter[195945]: ERROR   22:12:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:12:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:12:22 compute-1 podman[213884]: 2026-01-27 22:12:22.78634122 +0000 UTC m=+0.081931609 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:12:33 compute-1 nova_compute[183751]: 2026-01-27 22:12:33.806 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:35 compute-1 podman[193064]: time="2026-01-27T22:12:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:12:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:12:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:12:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:12:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 22:12:39 compute-1 nova_compute[183751]: 2026-01-27 22:12:39.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:40 compute-1 podman[213909]: 2026-01-27 22:12:40.786842808 +0000 UTC m=+0.097108664 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:12:40 compute-1 nova_compute[183751]: 2026-01-27 22:12:40.787 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:43 compute-1 podman[213935]: 2026-01-27 22:12:43.786542538 +0000 UTC m=+0.078295649 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 27 22:12:43 compute-1 podman[213936]: 2026-01-27 22:12:43.793252605 +0000 UTC m=+0.074677960 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Jan 27 22:12:44 compute-1 nova_compute[183751]: 2026-01-27 22:12:44.656 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:44 compute-1 nova_compute[183751]: 2026-01-27 22:12:44.657 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:44 compute-1 nova_compute[183751]: 2026-01-27 22:12:44.657 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.669 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.670 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.670 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.670 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.903 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.905 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.929 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.930 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6079MB free_disk=73.17728424072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.931 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:12:48 compute-1 nova_compute[183751]: 2026-01-27 22:12:48.931 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:12:49 compute-1 openstack_network_exporter[195945]: ERROR   22:12:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:12:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:12:49 compute-1 openstack_network_exporter[195945]: ERROR   22:12:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:12:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:12:49 compute-1 nova_compute[183751]: 2026-01-27 22:12:49.996 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:12:49 compute-1 nova_compute[183751]: 2026-01-27 22:12:49.996 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:12:48 up  2:15,  0 user,  load average: 0.00, 0.00, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:12:50 compute-1 nova_compute[183751]: 2026-01-27 22:12:50.030 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 22:12:50 compute-1 nova_compute[183751]: 2026-01-27 22:12:50.050 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 22:12:50 compute-1 nova_compute[183751]: 2026-01-27 22:12:50.051 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:12:50 compute-1 nova_compute[183751]: 2026-01-27 22:12:50.079 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 22:12:50 compute-1 nova_compute[183751]: 2026-01-27 22:12:50.113 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 22:12:50 compute-1 nova_compute[183751]: 2026-01-27 22:12:50.142 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:12:50 compute-1 nova_compute[183751]: 2026-01-27 22:12:50.651 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:12:51 compute-1 nova_compute[183751]: 2026-01-27 22:12:51.164 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:12:51 compute-1 nova_compute[183751]: 2026-01-27 22:12:51.164 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.233s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:12:52 compute-1 nova_compute[183751]: 2026-01-27 22:12:52.163 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:52 compute-1 nova_compute[183751]: 2026-01-27 22:12:52.164 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:12:53 compute-1 nova_compute[183751]: 2026-01-27 22:12:53.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:12:53 compute-1 podman[213978]: 2026-01-27 22:12:53.779053108 +0000 UTC m=+0.079260843 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:13:05 compute-1 podman[193064]: time="2026-01-27T22:13:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:13:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:13:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:13:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:13:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 22:13:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:13:11.247 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:13:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:13:11.247 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:13:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:13:11.248 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:13:11 compute-1 podman[214003]: 2026-01-27 22:13:11.819425184 +0000 UTC m=+0.125395145 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Jan 27 22:13:14 compute-1 podman[214030]: 2026-01-27 22:13:14.775749021 +0000 UTC m=+0.077673404 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 27 22:13:14 compute-1 podman[214029]: 2026-01-27 22:13:14.791320816 +0000 UTC m=+0.098181121 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 22:13:19 compute-1 openstack_network_exporter[195945]: ERROR   22:13:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:13:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:13:19 compute-1 openstack_network_exporter[195945]: ERROR   22:13:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:13:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:13:24 compute-1 podman[214069]: 2026-01-27 22:13:24.773336069 +0000 UTC m=+0.079131689 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:13:35 compute-1 podman[193064]: time="2026-01-27T22:13:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:13:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:13:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:13:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:13:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 22:13:41 compute-1 nova_compute[183751]: 2026-01-27 22:13:41.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:13:42 compute-1 podman[214093]: 2026-01-27 22:13:42.829221189 +0000 UTC m=+0.129980317 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Jan 27 22:13:44 compute-1 nova_compute[183751]: 2026-01-27 22:13:44.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:13:45 compute-1 nova_compute[183751]: 2026-01-27 22:13:45.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:13:45 compute-1 podman[214121]: 2026-01-27 22:13:45.77398431 +0000 UTC m=+0.069024309 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 27 22:13:45 compute-1 podman[214120]: 2026-01-27 22:13:45.805586662 +0000 UTC m=+0.101174165 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc.)
Jan 27 22:13:46 compute-1 nova_compute[183751]: 2026-01-27 22:13:46.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:13:49 compute-1 openstack_network_exporter[195945]: ERROR   22:13:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:13:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:13:49 compute-1 openstack_network_exporter[195945]: ERROR   22:13:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:13:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.667 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.862 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.863 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.887 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.888 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6077MB free_disk=73.17728424072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.888 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:13:49 compute-1 nova_compute[183751]: 2026-01-27 22:13:49.889 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:13:51 compute-1 nova_compute[183751]: 2026-01-27 22:13:51.007 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:13:51 compute-1 nova_compute[183751]: 2026-01-27 22:13:51.007 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:13:49 up  2:16,  0 user,  load average: 0.00, 0.00, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:13:51 compute-1 nova_compute[183751]: 2026-01-27 22:13:51.081 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:13:51 compute-1 nova_compute[183751]: 2026-01-27 22:13:51.590 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:13:52 compute-1 nova_compute[183751]: 2026-01-27 22:13:52.103 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:13:52 compute-1 nova_compute[183751]: 2026-01-27 22:13:52.104 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.215s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:13:53 compute-1 nova_compute[183751]: 2026-01-27 22:13:53.106 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:13:55 compute-1 nova_compute[183751]: 2026-01-27 22:13:55.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:13:55 compute-1 podman[214162]: 2026-01-27 22:13:55.781456501 +0000 UTC m=+0.083803856 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:13:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:13:56.702 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:13:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:13:56.704 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:13:58 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:13:58.707 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:14:00 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:00.235 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:9a:0b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-85d74afe-a45f-4876-a8e1-bb8adb73fa0c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85d74afe-a45f-4876-a8e1-bb8adb73fa0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1486088be91644128586b2bff5eafeed', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b28eae22-3cd4-423a-93b5-6ed6cee5bf6e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3715083f-d013-467a-8d6b-b0e6bb924f34) old=Port_Binding(mac=['fa:16:3e:4e:9a:0b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-85d74afe-a45f-4876-a8e1-bb8adb73fa0c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85d74afe-a45f-4876-a8e1-bb8adb73fa0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1486088be91644128586b2bff5eafeed', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:14:00 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:00.236 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3715083f-d013-467a-8d6b-b0e6bb924f34 in datapath 85d74afe-a45f-4876-a8e1-bb8adb73fa0c updated
Jan 27 22:14:00 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:00.237 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 85d74afe-a45f-4876-a8e1-bb8adb73fa0c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:14:00 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:00.239 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7214dfda-ebcf-4385-a0ba-5af4aabae06e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:14:02 compute-1 nova_compute[183751]: 2026-01-27 22:14:02.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:14:05 compute-1 podman[193064]: time="2026-01-27T22:14:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:14:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:14:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:14:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:14:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 22:14:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:09.575 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:f2:9e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4ef8a602-1628-4681-991f-cc7417825325', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ef8a602-1628-4681-991f-cc7417825325', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '563a2a2371bb43b4825c174fd22dcfb3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ab65195-0e0b-4d55-887b-ee3bb69d4cb3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0604674c-dadb-48fd-933b-b5c52cd28e9a) old=Port_Binding(mac=['fa:16:3e:c6:f2:9e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4ef8a602-1628-4681-991f-cc7417825325', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ef8a602-1628-4681-991f-cc7417825325', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '563a2a2371bb43b4825c174fd22dcfb3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:14:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:09.577 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0604674c-dadb-48fd-933b-b5c52cd28e9a in datapath 4ef8a602-1628-4681-991f-cc7417825325 updated
Jan 27 22:14:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:09.578 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4ef8a602-1628-4681-991f-cc7417825325, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:14:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:09.580 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c48dc8fa-aed6-433e-8fa0-767d8c00a0b1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:14:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:11.249 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:14:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:11.249 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:14:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:14:11.250 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:14:13 compute-1 podman[214188]: 2026-01-27 22:14:13.84179844 +0000 UTC m=+0.142553519 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 27 22:14:16 compute-1 podman[214215]: 2026-01-27 22:14:16.758146667 +0000 UTC m=+0.065901583 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 27 22:14:16 compute-1 podman[214214]: 2026-01-27 22:14:16.758525576 +0000 UTC m=+0.071732156 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64)
Jan 27 22:14:19 compute-1 openstack_network_exporter[195945]: ERROR   22:14:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:14:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:14:19 compute-1 openstack_network_exporter[195945]: ERROR   22:14:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:14:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:14:26 compute-1 podman[214255]: 2026-01-27 22:14:26.751675213 +0000 UTC m=+0.060160370 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:14:35 compute-1 podman[193064]: time="2026-01-27T22:14:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:14:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:14:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:14:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:14:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 22:14:39 compute-1 sshd-session[214280]: Received disconnect from 91.224.92.108 port 32980:11:  [preauth]
Jan 27 22:14:39 compute-1 sshd-session[214280]: Disconnected from authenticating user root 91.224.92.108 port 32980 [preauth]
Jan 27 22:14:43 compute-1 nova_compute[183751]: 2026-01-27 22:14:43.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:14:44 compute-1 podman[214282]: 2026-01-27 22:14:44.823621681 +0000 UTC m=+0.124446072 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 27 22:14:45 compute-1 nova_compute[183751]: 2026-01-27 22:14:45.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:14:46 compute-1 nova_compute[183751]: 2026-01-27 22:14:46.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:14:46 compute-1 nova_compute[183751]: 2026-01-27 22:14:46.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:14:47 compute-1 podman[214308]: 2026-01-27 22:14:47.803995443 +0000 UTC m=+0.100453698 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Jan 27 22:14:47 compute-1 podman[214309]: 2026-01-27 22:14:47.834850647 +0000 UTC m=+0.120389851 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:14:49 compute-1 openstack_network_exporter[195945]: ERROR   22:14:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:14:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:14:49 compute-1 openstack_network_exporter[195945]: ERROR   22:14:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:14:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.668 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.669 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.889 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.890 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.911 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.912 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6073MB free_disk=73.17728424072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.912 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:14:49 compute-1 nova_compute[183751]: 2026-01-27 22:14:49.912 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:14:50 compute-1 nova_compute[183751]: 2026-01-27 22:14:50.976 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:14:50 compute-1 nova_compute[183751]: 2026-01-27 22:14:50.976 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:14:49 up  2:17,  0 user,  load average: 0.07, 0.02, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:14:51 compute-1 nova_compute[183751]: 2026-01-27 22:14:51.007 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:14:51 compute-1 nova_compute[183751]: 2026-01-27 22:14:51.515 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:14:52 compute-1 nova_compute[183751]: 2026-01-27 22:14:52.025 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:14:52 compute-1 nova_compute[183751]: 2026-01-27 22:14:52.025 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:14:53 compute-1 nova_compute[183751]: 2026-01-27 22:14:53.025 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:14:53 compute-1 nova_compute[183751]: 2026-01-27 22:14:53.026 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:14:53 compute-1 nova_compute[183751]: 2026-01-27 22:14:53.026 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:14:55 compute-1 nova_compute[183751]: 2026-01-27 22:14:55.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:14:57 compute-1 podman[214349]: 2026-01-27 22:14:57.768088781 +0000 UTC m=+0.072066415 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:15:05 compute-1 podman[193064]: time="2026-01-27T22:15:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:15:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:15:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:15:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:15:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 22:15:06 compute-1 sshd-session[214373]: Invalid user lighthouse from 80.94.92.186 port 51262
Jan 27 22:15:07 compute-1 sshd-session[214373]: Connection closed by invalid user lighthouse 80.94.92.186 port 51262 [preauth]
Jan 27 22:15:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:15:11.251 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:15:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:15:11.252 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:15:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:15:11.252 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:15:15 compute-1 podman[214376]: 2026-01-27 22:15:15.835190568 +0000 UTC m=+0.139875982 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
io.buildah.version=1.41.4, org.label-schema.build-date=20260126)
Jan 27 22:15:18 compute-1 podman[214403]: 2026-01-27 22:15:18.798034408 +0000 UTC m=+0.091725412 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2)
Jan 27 22:15:18 compute-1 podman[214402]: 2026-01-27 22:15:18.80541497 +0000 UTC m=+0.108229930 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Jan 27 22:15:19 compute-1 openstack_network_exporter[195945]: ERROR   22:15:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:15:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:15:19 compute-1 openstack_network_exporter[195945]: ERROR   22:15:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:15:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:15:28 compute-1 podman[214438]: 2026-01-27 22:15:28.777141539 +0000 UTC m=+0.078502144 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:15:35 compute-1 podman[193064]: time="2026-01-27T22:15:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:15:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:15:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:15:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:15:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 22:15:45 compute-1 nova_compute[183751]: 2026-01-27 22:15:45.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:15:46 compute-1 nova_compute[183751]: 2026-01-27 22:15:46.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:15:46 compute-1 nova_compute[183751]: 2026-01-27 22:15:46.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:15:46 compute-1 podman[214463]: 2026-01-27 22:15:46.832249028 +0000 UTC m=+0.128278386 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Jan 27 22:15:48 compute-1 nova_compute[183751]: 2026-01-27 22:15:48.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:15:49 compute-1 openstack_network_exporter[195945]: ERROR   22:15:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:15:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:15:49 compute-1 openstack_network_exporter[195945]: ERROR   22:15:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:15:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:15:49 compute-1 podman[214489]: 2026-01-27 22:15:49.786988756 +0000 UTC m=+0.093893766 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Jan 27 22:15:49 compute-1 podman[214490]: 2026-01-27 22:15:49.799400863 +0000 UTC m=+0.100010327 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.850 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.851 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.886 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.887 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6071MB free_disk=73.17728424072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.887 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:15:50 compute-1 nova_compute[183751]: 2026-01-27 22:15:50.887 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:15:52 compute-1 nova_compute[183751]: 2026-01-27 22:15:52.137 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:15:52 compute-1 nova_compute[183751]: 2026-01-27 22:15:52.137 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:15:50 up  2:18,  0 user,  load average: 0.02, 0.01, 0.00\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:15:52 compute-1 nova_compute[183751]: 2026-01-27 22:15:52.168 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:15:52 compute-1 nova_compute[183751]: 2026-01-27 22:15:52.678 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:15:53 compute-1 nova_compute[183751]: 2026-01-27 22:15:53.189 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:15:53 compute-1 nova_compute[183751]: 2026-01-27 22:15:53.190 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.302s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:15:54 compute-1 nova_compute[183751]: 2026-01-27 22:15:54.190 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:15:54 compute-1 nova_compute[183751]: 2026-01-27 22:15:54.190 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:15:54 compute-1 nova_compute[183751]: 2026-01-27 22:15:54.191 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:15:57 compute-1 nova_compute[183751]: 2026-01-27 22:15:57.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:15:59 compute-1 podman[214530]: 2026-01-27 22:15:59.774659937 +0000 UTC m=+0.086640155 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:16:05 compute-1 podman[193064]: time="2026-01-27T22:16:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:16:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:16:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:16:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:16:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2167 "" "Go-http-client/1.1"
Jan 27 22:16:07 compute-1 nova_compute[183751]: 2026-01-27 22:16:07.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:16:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:16:11.253 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:16:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:16:11.254 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:16:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:16:11.254 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:16:17 compute-1 podman[214555]: 2026-01-27 22:16:17.802413171 +0000 UTC m=+0.108142278 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 
Base Image, org.label-schema.schema-version=1.0)
Jan 27 22:16:19 compute-1 openstack_network_exporter[195945]: ERROR   22:16:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:16:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:16:19 compute-1 openstack_network_exporter[195945]: ERROR   22:16:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:16:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:16:20 compute-1 podman[214582]: 2026-01-27 22:16:20.75397625 +0000 UTC m=+0.067754548 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 22:16:20 compute-1 podman[214581]: 2026-01-27 22:16:20.777966664 +0000 UTC m=+0.084373900 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, release=1755695350, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, 
build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6)
Jan 27 22:16:30 compute-1 podman[214620]: 2026-01-27 22:16:30.766569388 +0000 UTC m=+0.075509930 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:16:35 compute-1 podman[193064]: time="2026-01-27T22:16:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:16:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:16:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:16:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:16:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 22:16:46 compute-1 nova_compute[183751]: 2026-01-27 22:16:46.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:16:48 compute-1 nova_compute[183751]: 2026-01-27 22:16:48.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:16:48 compute-1 nova_compute[183751]: 2026-01-27 22:16:48.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:16:48 compute-1 podman[214644]: 2026-01-27 22:16:48.810003022 +0000 UTC m=+0.114669759 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:16:49 compute-1 nova_compute[183751]: 2026-01-27 22:16:49.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:16:49 compute-1 openstack_network_exporter[195945]: ERROR   22:16:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:16:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:16:49 compute-1 openstack_network_exporter[195945]: ERROR   22:16:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:16:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:16:51 compute-1 nova_compute[183751]: 2026-01-27 22:16:51.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:16:51 compute-1 nova_compute[183751]: 2026-01-27 22:16:51.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:16:51 compute-1 nova_compute[183751]: 2026-01-27 22:16:51.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 22:16:51 compute-1 nova_compute[183751]: 2026-01-27 22:16:51.662 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 22:16:51 compute-1 podman[214672]: 2026-01-27 22:16:51.758163077 +0000 UTC m=+0.064115828 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:16:51 compute-1 podman[214671]: 2026-01-27 22:16:51.764269068 +0000 UTC m=+0.071819519 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, build-date=2025-08-20T13:12:41)
Jan 27 22:16:52 compute-1 nova_compute[183751]: 2026-01-27 22:16:52.663 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:16:54 compute-1 nova_compute[183751]: 2026-01-27 22:16:54.376 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:16:54 compute-1 nova_compute[183751]: 2026-01-27 22:16:54.377 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:16:54 compute-1 nova_compute[183751]: 2026-01-27 22:16:54.377 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:16:54 compute-1 nova_compute[183751]: 2026-01-27 22:16:54.377 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:16:54 compute-1 nova_compute[183751]: 2026-01-27 22:16:54.579 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:16:54 compute-1 nova_compute[183751]: 2026-01-27 22:16:54.580 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:16:54 compute-1 nova_compute[183751]: 2026-01-27 22:16:54.617 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:16:54 compute-1 nova_compute[183751]: 2026-01-27 22:16:54.618 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6084MB free_disk=73.17728424072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:16:54 compute-1 nova_compute[183751]: 2026-01-27 22:16:54.619 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:16:54 compute-1 nova_compute[183751]: 2026-01-27 22:16:54.619 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:16:55 compute-1 nova_compute[183751]: 2026-01-27 22:16:55.782 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:16:55 compute-1 nova_compute[183751]: 2026-01-27 22:16:55.782 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:16:54 up  2:19,  0 user,  load average: 0.17, 0.04, 0.01\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:16:55 compute-1 nova_compute[183751]: 2026-01-27 22:16:55.832 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:16:56 compute-1 nova_compute[183751]: 2026-01-27 22:16:56.339 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:16:56 compute-1 nova_compute[183751]: 2026-01-27 22:16:56.857 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:16:56 compute-1 nova_compute[183751]: 2026-01-27 22:16:56.857 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.238s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:16:58 compute-1 nova_compute[183751]: 2026-01-27 22:16:58.342 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:16:58 compute-1 nova_compute[183751]: 2026-01-27 22:16:58.342 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:16:58 compute-1 nova_compute[183751]: 2026-01-27 22:16:58.342 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:17:01 compute-1 podman[214710]: 2026-01-27 22:17:01.771202616 +0000 UTC m=+0.076662399 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:17:03 compute-1 nova_compute[183751]: 2026-01-27 22:17:03.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:17:03 compute-1 nova_compute[183751]: 2026-01-27 22:17:03.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 22:17:05 compute-1 podman[193064]: time="2026-01-27T22:17:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:17:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:17:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:17:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:17:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 22:17:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:17:11.255 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:17:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:17:11.255 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:17:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:17:11.255 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:17:19 compute-1 openstack_network_exporter[195945]: ERROR   22:17:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:17:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:17:19 compute-1 openstack_network_exporter[195945]: ERROR   22:17:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:17:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:17:19 compute-1 podman[214736]: 2026-01-27 22:17:19.588592104 +0000 UTC m=+0.124198096 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.41.4)
Jan 27 22:17:22 compute-1 nova_compute[183751]: 2026-01-27 22:17:22.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:17:22 compute-1 podman[214763]: 2026-01-27 22:17:22.76385639 +0000 UTC m=+0.077128250 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 22:17:22 compute-1 podman[214762]: 2026-01-27 22:17:22.78323322 +0000 UTC m=+0.092842539 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, 
distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 27 22:17:32 compute-1 podman[214801]: 2026-01-27 22:17:32.779156234 +0000 UTC m=+0.086834500 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:17:35 compute-1 podman[193064]: time="2026-01-27T22:17:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:17:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:17:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:17:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:17:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2164 "" "Go-http-client/1.1"
Jan 27 22:17:46 compute-1 nova_compute[183751]: 2026-01-27 22:17:46.655 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:17:48 compute-1 nova_compute[183751]: 2026-01-27 22:17:48.146 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:17:49 compute-1 nova_compute[183751]: 2026-01-27 22:17:49.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:17:49 compute-1 nova_compute[183751]: 2026-01-27 22:17:49.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:17:49 compute-1 openstack_network_exporter[195945]: ERROR   22:17:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:17:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:17:49 compute-1 openstack_network_exporter[195945]: ERROR   22:17:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:17:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:17:49 compute-1 podman[214825]: 2026-01-27 22:17:49.820406096 +0000 UTC m=+0.121540899 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:17:51 compute-1 nova_compute[183751]: 2026-01-27 22:17:51.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.661 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.662 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.662 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.663 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.881 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.882 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.898 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.899 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6076MB free_disk=73.17728424072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.899 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:17:52 compute-1 nova_compute[183751]: 2026-01-27 22:17:52.899 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:17:53 compute-1 podman[214853]: 2026-01-27 22:17:53.814250295 +0000 UTC m=+0.106136148 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 27 22:17:53 compute-1 podman[214854]: 2026-01-27 22:17:53.814555032 +0000 UTC m=+0.102803405 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:17:53 compute-1 nova_compute[183751]: 2026-01-27 22:17:53.954 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:17:53 compute-1 nova_compute[183751]: 2026-01-27 22:17:53.955 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:17:52 up  2:20,  0 user,  load average: 0.06, 0.03, 0.01\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:17:53 compute-1 nova_compute[183751]: 2026-01-27 22:17:53.984 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 22:17:54 compute-1 nova_compute[183751]: 2026-01-27 22:17:54.002 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 22:17:54 compute-1 nova_compute[183751]: 2026-01-27 22:17:54.003 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:17:54 compute-1 nova_compute[183751]: 2026-01-27 22:17:54.025 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 22:17:54 compute-1 nova_compute[183751]: 2026-01-27 22:17:54.059 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 22:17:54 compute-1 nova_compute[183751]: 2026-01-27 22:17:54.083 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:17:54 compute-1 nova_compute[183751]: 2026-01-27 22:17:54.592 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:17:55 compute-1 nova_compute[183751]: 2026-01-27 22:17:55.105 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:17:55 compute-1 nova_compute[183751]: 2026-01-27 22:17:55.106 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.206s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:17:58 compute-1 nova_compute[183751]: 2026-01-27 22:17:58.108 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:17:58 compute-1 nova_compute[183751]: 2026-01-27 22:17:58.108 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:17:58 compute-1 nova_compute[183751]: 2026-01-27 22:17:58.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:18:03 compute-1 podman[214892]: 2026-01-27 22:18:03.745774207 +0000 UTC m=+0.062448447 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:18:05 compute-1 podman[193064]: time="2026-01-27T22:18:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:18:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:18:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:18:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:18:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2165 "" "Go-http-client/1.1"
Jan 27 22:18:09 compute-1 nova_compute[183751]: 2026-01-27 22:18:09.146 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:18:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:18:11.256 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:18:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:18:11.256 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:18:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:18:11.257 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:18:19 compute-1 openstack_network_exporter[195945]: ERROR   22:18:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:18:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:18:19 compute-1 openstack_network_exporter[195945]: ERROR   22:18:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:18:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:18:20 compute-1 podman[214917]: 2026-01-27 22:18:20.814070771 +0000 UTC m=+0.115536181 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4)
Jan 27 22:18:25 compute-1 podman[214945]: 2026-01-27 22:18:25.015015265 +0000 UTC m=+0.076104044 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 27 22:18:25 compute-1 podman[214944]: 2026-01-27 22:18:25.021542157 +0000 UTC m=+0.087344163 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Jan 27 22:18:34 compute-1 podman[214980]: 2026-01-27 22:18:34.775670996 +0000 UTC m=+0.079005277 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:18:35 compute-1 podman[193064]: time="2026-01-27T22:18:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:18:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:18:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:18:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:18:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2168 "" "Go-http-client/1.1"
Jan 27 22:18:48 compute-1 nova_compute[183751]: 2026-01-27 22:18:48.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:18:49 compute-1 nova_compute[183751]: 2026-01-27 22:18:49.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:18:49 compute-1 openstack_network_exporter[195945]: ERROR   22:18:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:18:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:18:49 compute-1 openstack_network_exporter[195945]: ERROR   22:18:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:18:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:18:50 compute-1 nova_compute[183751]: 2026-01-27 22:18:50.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:18:51 compute-1 nova_compute[183751]: 2026-01-27 22:18:51.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:18:51 compute-1 podman[215004]: 2026-01-27 22:18:51.863805164 +0000 UTC m=+0.162773990 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.667 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.906 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.908 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.933 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.934 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6071MB free_disk=73.17728424072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.934 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:18:52 compute-1 nova_compute[183751]: 2026-01-27 22:18:52.935 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:18:54 compute-1 nova_compute[183751]: 2026-01-27 22:18:54.070 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:18:54 compute-1 nova_compute[183751]: 2026-01-27 22:18:54.071 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:18:52 up  2:21,  0 user,  load average: 0.60, 0.18, 0.05\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:18:54 compute-1 nova_compute[183751]: 2026-01-27 22:18:54.098 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:18:54 compute-1 nova_compute[183751]: 2026-01-27 22:18:54.608 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:18:55 compute-1 nova_compute[183751]: 2026-01-27 22:18:55.122 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:18:55 compute-1 nova_compute[183751]: 2026-01-27 22:18:55.123 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.188s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:18:55 compute-1 podman[215031]: 2026-01-27 22:18:55.790015887 +0000 UTC m=+0.092043949 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 22:18:55 compute-1 podman[215032]: 2026-01-27 22:18:55.790093099 +0000 UTC m=+0.094672694 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 27 22:19:00 compute-1 nova_compute[183751]: 2026-01-27 22:19:00.124 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:19:00 compute-1 nova_compute[183751]: 2026-01-27 22:19:00.124 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:19:00 compute-1 nova_compute[183751]: 2026-01-27 22:19:00.125 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:19:05 compute-1 podman[193064]: time="2026-01-27T22:19:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:19:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:19:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:19:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:19:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Jan 27 22:19:05 compute-1 podman[215068]: 2026-01-27 22:19:05.76525117 +0000 UTC m=+0.074712690 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:19:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:11.258 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:19:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:11.258 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:19:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:11.258 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:19:19 compute-1 openstack_network_exporter[195945]: ERROR   22:19:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:19:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:19:19 compute-1 openstack_network_exporter[195945]: ERROR   22:19:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:19:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:19:22 compute-1 podman[215092]: 2026-01-27 22:19:22.860422419 +0000 UTC m=+0.165607630 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 22:19:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:22.890 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:19:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:22.890 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:19:26 compute-1 podman[215122]: 2026-01-27 22:19:26.775879805 +0000 UTC m=+0.081668832 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126)
Jan 27 22:19:26 compute-1 podman[215121]: 2026-01-27 22:19:26.77729046 +0000 UTC m=+0.086639765 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, architecture=x86_64)
Jan 27 22:19:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:29.892 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:19:34 compute-1 sshd-session[215162]: Invalid user lighthouse from 80.94.92.186 port 54302
Jan 27 22:19:34 compute-1 sshd-session[215162]: Connection closed by invalid user lighthouse 80.94.92.186 port 54302 [preauth]
Jan 27 22:19:35 compute-1 podman[193064]: time="2026-01-27T22:19:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:19:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:19:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:19:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:19:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2163 "" "Go-http-client/1.1"
Jan 27 22:19:36 compute-1 podman[215164]: 2026-01-27 22:19:36.779146505 +0000 UTC m=+0.079663733 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:19:40 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:40.261 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:20:b1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-003e25ef-7f26-49e0-80b8-c9281daaa995', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3ad1e42a07f4ac1a56d65a2f5f430d2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54db84df-1bc7-40a2-98c5-ed19faa4a30e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ffc119d7-8c66-43c4-841c-f7703d759400) old=Port_Binding(mac=['fa:16:3e:af:20:b1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-003e25ef-7f26-49e0-80b8-c9281daaa995', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3ad1e42a07f4ac1a56d65a2f5f430d2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:19:40 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:40.262 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ffc119d7-8c66-43c4-841c-f7703d759400 in datapath 003e25ef-7f26-49e0-80b8-c9281daaa995 updated
Jan 27 22:19:40 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:40.262 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 003e25ef-7f26-49e0-80b8-c9281daaa995, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:19:40 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:40.264 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[ad583ae4-50e9-495f-b84c-8812fe94740c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:19:48 compute-1 nova_compute[183751]: 2026-01-27 22:19:48.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:19:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:49.249 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:40:ee 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f3f2a1ab-1846-463c-924e-2fbac41be95d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3f2a1ab-1846-463c-924e-2fbac41be95d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dbb53a6bca6547238f0b67bd582b0ea5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bde6747e-9cb0-4ea7-a39d-0c9caf3620b7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=851de326-9dff-462c-9999-64efff0a72e2) old=Port_Binding(mac=['fa:16:3e:c7:40:ee'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f3f2a1ab-1846-463c-924e-2fbac41be95d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3f2a1ab-1846-463c-924e-2fbac41be95d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dbb53a6bca6547238f0b67bd582b0ea5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:19:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:49.250 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 851de326-9dff-462c-9999-64efff0a72e2 in datapath f3f2a1ab-1846-463c-924e-2fbac41be95d updated
Jan 27 22:19:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:49.251 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f3f2a1ab-1846-463c-924e-2fbac41be95d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:19:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:19:49.252 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[70bbe3d8-69e4-4004-bbd3-b07e3875a11f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:19:49 compute-1 openstack_network_exporter[195945]: ERROR   22:19:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:19:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:19:49 compute-1 openstack_network_exporter[195945]: ERROR   22:19:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:19:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:19:50 compute-1 nova_compute[183751]: 2026-01-27 22:19:50.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.860 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.861 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.878 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.879 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6075MB free_disk=73.17728424072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.879 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:19:52 compute-1 nova_compute[183751]: 2026-01-27 22:19:52.879 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:19:53 compute-1 podman[215191]: 2026-01-27 22:19:53.80874727 +0000 UTC m=+0.113427149 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, 
org.label-schema.build-date=20260126, tcib_managed=true, config_id=ovn_controller)
Jan 27 22:19:53 compute-1 nova_compute[183751]: 2026-01-27 22:19:53.937 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:19:53 compute-1 nova_compute[183751]: 2026-01-27 22:19:53.938 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:19:52 up  2:22,  0 user,  load average: 0.35, 0.18, 0.06\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:19:54 compute-1 nova_compute[183751]: 2026-01-27 22:19:54.008 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:19:54 compute-1 nova_compute[183751]: 2026-01-27 22:19:54.515 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:19:55 compute-1 nova_compute[183751]: 2026-01-27 22:19:55.029 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:19:55 compute-1 nova_compute[183751]: 2026-01-27 22:19:55.030 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:19:56 compute-1 nova_compute[183751]: 2026-01-27 22:19:56.033 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:19:56 compute-1 nova_compute[183751]: 2026-01-27 22:19:56.034 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:19:57 compute-1 nova_compute[183751]: 2026-01-27 22:19:57.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:19:57 compute-1 nova_compute[183751]: 2026-01-27 22:19:57.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:19:57 compute-1 podman[215218]: 2026-01-27 22:19:57.778215174 +0000 UTC m=+0.084499583 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126, io.buildah.version=1.41.4)
Jan 27 22:19:57 compute-1 podman[215217]: 2026-01-27 22:19:57.789172355 +0000 UTC m=+0.091610398 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 22:19:59 compute-1 nova_compute[183751]: 2026-01-27 22:19:59.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:20:05 compute-1 podman[193064]: time="2026-01-27T22:20:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:20:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:20:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:20:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:20:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2166 "" "Go-http-client/1.1"
Jan 27 22:20:07 compute-1 podman[215257]: 2026-01-27 22:20:07.787858708 +0000 UTC m=+0.089142177 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:20:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:11.259 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:11.259 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:11.260 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:14 compute-1 nova_compute[183751]: 2026-01-27 22:20:14.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:20:19 compute-1 openstack_network_exporter[195945]: ERROR   22:20:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:20:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:20:19 compute-1 openstack_network_exporter[195945]: ERROR   22:20:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:20:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:20:24 compute-1 podman[215282]: 2026-01-27 22:20:24.833354106 +0000 UTC m=+0.133038114 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:20:26 compute-1 nova_compute[183751]: 2026-01-27 22:20:26.500 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "6382ecba-36e3-4f7d-81c9-3951a047b87f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:26 compute-1 nova_compute[183751]: 2026-01-27 22:20:26.501 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:27 compute-1 nova_compute[183751]: 2026-01-27 22:20:27.008 183755 DEBUG nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 27 22:20:27 compute-1 nova_compute[183751]: 2026-01-27 22:20:27.624 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:27 compute-1 nova_compute[183751]: 2026-01-27 22:20:27.625 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:27 compute-1 nova_compute[183751]: 2026-01-27 22:20:27.635 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 27 22:20:27 compute-1 nova_compute[183751]: 2026-01-27 22:20:27.635 183755 INFO nova.compute.claims [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Claim successful on node compute-1.ctlplane.example.com
Jan 27 22:20:28 compute-1 nova_compute[183751]: 2026-01-27 22:20:28.757 183755 DEBUG nova.compute.provider_tree [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:20:28 compute-1 podman[215310]: 2026-01-27 22:20:28.792325902 +0000 UTC m=+0.087328093 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 27 22:20:28 compute-1 podman[215309]: 2026-01-27 22:20:28.813370763 +0000 UTC m=+0.117185652 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64)
Jan 27 22:20:29 compute-1 nova_compute[183751]: 2026-01-27 22:20:29.267 183755 DEBUG nova.scheduler.client.report [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:20:29 compute-1 nova_compute[183751]: 2026-01-27 22:20:29.780 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.155s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:29 compute-1 nova_compute[183751]: 2026-01-27 22:20:29.782 183755 DEBUG nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 27 22:20:30 compute-1 nova_compute[183751]: 2026-01-27 22:20:30.295 183755 DEBUG nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 27 22:20:30 compute-1 nova_compute[183751]: 2026-01-27 22:20:30.296 183755 DEBUG nova.network.neutron [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 27 22:20:30 compute-1 nova_compute[183751]: 2026-01-27 22:20:30.297 183755 WARNING neutronclient.v2_0.client [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:20:30 compute-1 nova_compute[183751]: 2026-01-27 22:20:30.299 183755 WARNING neutronclient.v2_0.client [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:20:30 compute-1 nova_compute[183751]: 2026-01-27 22:20:30.807 183755 INFO nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 27 22:20:31 compute-1 nova_compute[183751]: 2026-01-27 22:20:31.318 183755 DEBUG nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 27 22:20:32 compute-1 nova_compute[183751]: 2026-01-27 22:20:32.350 183755 DEBUG nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 27 22:20:32 compute-1 nova_compute[183751]: 2026-01-27 22:20:32.353 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 27 22:20:32 compute-1 nova_compute[183751]: 2026-01-27 22:20:32.354 183755 INFO nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Creating image(s)
Jan 27 22:20:32 compute-1 nova_compute[183751]: 2026-01-27 22:20:32.356 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "/var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:32 compute-1 nova_compute[183751]: 2026-01-27 22:20:32.356 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "/var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:32 compute-1 nova_compute[183751]: 2026-01-27 22:20:32.358 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "/var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:32 compute-1 nova_compute[183751]: 2026-01-27 22:20:32.359 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:32 compute-1 nova_compute[183751]: 2026-01-27 22:20:32.360 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:32 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:32.515 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:20:32 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:32.516 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:20:32 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:32.517 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:32 compute-1 nova_compute[183751]: 2026-01-27 22:20:32.761 183755 DEBUG nova.network.neutron [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Successfully created port: 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 27 22:20:33 compute-1 nova_compute[183751]: 2026-01-27 22:20:33.737 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:20:33 compute-1 nova_compute[183751]: 2026-01-27 22:20:33.743 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:20:33 compute-1 nova_compute[183751]: 2026-01-27 22:20:33.744 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:20:33 compute-1 nova_compute[183751]: 2026-01-27 22:20:33.823 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd.part --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:20:33 compute-1 nova_compute[183751]: 2026-01-27 22:20:33.825 183755 DEBUG nova.virt.images [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] 46eb297a-0b7d-41f9-8336-a7ae35b5797e was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Jan 27 22:20:33 compute-1 nova_compute[183751]: 2026-01-27 22:20:33.828 183755 DEBUG nova.privsep.utils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Jan 27 22:20:33 compute-1 nova_compute[183751]: 2026-01-27 22:20:33.829 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd.part /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:20:33 compute-1 nova_compute[183751]: 2026-01-27 22:20:33.908 183755 DEBUG nova.network.neutron [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Successfully updated port: 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.211 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd.part /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd.converted" returned: 0 in 0.382s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.219 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.291 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd.converted --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.293 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.933s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.294 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.301 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.303 183755 INFO oslo.privsep.daemon [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpwy3h76gx/privsep.sock']
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.332 183755 DEBUG nova.compute.manager [req-68985156-2f9d-44fc-8bb4-a62ff4df0c0b req-8a9eb524-a92e-47b4-80a9-868934ee16f5 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Received event network-changed-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.332 183755 DEBUG nova.compute.manager [req-68985156-2f9d-44fc-8bb4-a62ff4df0c0b req-8a9eb524-a92e-47b4-80a9-868934ee16f5 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Refreshing instance network info cache due to event network-changed-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.333 183755 DEBUG oslo_concurrency.lockutils [req-68985156-2f9d-44fc-8bb4-a62ff4df0c0b req-8a9eb524-a92e-47b4-80a9-868934ee16f5 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "refresh_cache-6382ecba-36e3-4f7d-81c9-3951a047b87f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.333 183755 DEBUG oslo_concurrency.lockutils [req-68985156-2f9d-44fc-8bb4-a62ff4df0c0b req-8a9eb524-a92e-47b4-80a9-868934ee16f5 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquired lock "refresh_cache-6382ecba-36e3-4f7d-81c9-3951a047b87f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.334 183755 DEBUG nova.network.neutron [req-68985156-2f9d-44fc-8bb4-a62ff4df0c0b req-8a9eb524-a92e-47b4-80a9-868934ee16f5 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Refreshing network info cache for port 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.431 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "refresh_cache-6382ecba-36e3-4f7d-81c9-3951a047b87f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:20:34 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.841 183755 WARNING neutronclient.v2_0.client [req-68985156-2f9d-44fc-8bb4-a62ff4df0c0b req-8a9eb524-a92e-47b4-80a9-868934ee16f5 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.132 183755 INFO oslo.privsep.daemon [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Spawned new privsep daemon via rootwrap
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.957 215366 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.962 215366 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.964 215366 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:34.964 215366 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215366
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.140 183755 DEBUG nova.network.neutron [req-68985156-2f9d-44fc-8bb4-a62ff4df0c0b req-8a9eb524-a92e-47b4-80a9-868934ee16f5 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.230 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.279 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.281 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.282 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.283 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.289 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.289 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.336 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.338 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.368 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.370 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.371 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.432 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.433 183755 DEBUG nova.virt.disk.api [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Checking if we can resize image /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.433 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.490 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.492 183755 DEBUG nova.virt.disk.api [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Cannot resize image /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.493 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.493 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Ensure instance console log exists: /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.494 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.494 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.495 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:35 compute-1 nova_compute[183751]: 2026-01-27 22:20:35.612 183755 DEBUG nova.network.neutron [req-68985156-2f9d-44fc-8bb4-a62ff4df0c0b req-8a9eb524-a92e-47b4-80a9-868934ee16f5 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:20:35 compute-1 podman[193064]: time="2026-01-27T22:20:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:20:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:20:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:20:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:20:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Jan 27 22:20:36 compute-1 nova_compute[183751]: 2026-01-27 22:20:36.122 183755 DEBUG oslo_concurrency.lockutils [req-68985156-2f9d-44fc-8bb4-a62ff4df0c0b req-8a9eb524-a92e-47b4-80a9-868934ee16f5 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Releasing lock "refresh_cache-6382ecba-36e3-4f7d-81c9-3951a047b87f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:20:36 compute-1 nova_compute[183751]: 2026-01-27 22:20:36.123 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquired lock "refresh_cache-6382ecba-36e3-4f7d-81c9-3951a047b87f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:20:36 compute-1 nova_compute[183751]: 2026-01-27 22:20:36.124 183755 DEBUG nova.network.neutron [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 27 22:20:37 compute-1 nova_compute[183751]: 2026-01-27 22:20:37.491 183755 DEBUG nova.network.neutron [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:20:38 compute-1 nova_compute[183751]: 2026-01-27 22:20:38.303 183755 WARNING neutronclient.v2_0.client [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:20:38 compute-1 nova_compute[183751]: 2026-01-27 22:20:38.490 183755 DEBUG nova.network.neutron [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Updating instance_info_cache with network_info: [{"id": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "address": "fa:16:3e:d0:5a:fa", "network": {"id": "003e25ef-7f26-49e0-80b8-c9281daaa995", "bridge": "br-int", "label": "tempest-TestDataModel-1766331649-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3ad1e42a07f4ac1a56d65a2f5f430d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c1ce5da-50", "ovs_interfaceid": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:20:38 compute-1 podman[215383]: 2026-01-27 22:20:38.792751358 +0000 UTC m=+0.091084825 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:20:38 compute-1 nova_compute[183751]: 2026-01-27 22:20:38.997 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Releasing lock "refresh_cache-6382ecba-36e3-4f7d-81c9-3951a047b87f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:20:38 compute-1 nova_compute[183751]: 2026-01-27 22:20:38.998 183755 DEBUG nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Instance network_info: |[{"id": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "address": "fa:16:3e:d0:5a:fa", "network": {"id": "003e25ef-7f26-49e0-80b8-c9281daaa995", "bridge": "br-int", "label": "tempest-TestDataModel-1766331649-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3ad1e42a07f4ac1a56d65a2f5f430d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c1ce5da-50", "ovs_interfaceid": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.001 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Start _get_guest_xml network_info=[{"id": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "address": "fa:16:3e:d0:5a:fa", "network": {"id": "003e25ef-7f26-49e0-80b8-c9281daaa995", "bridge": "br-int", "label": "tempest-TestDataModel-1766331649-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3ad1e42a07f4ac1a56d65a2f5f430d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c1ce5da-50", "ovs_interfaceid": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '46eb297a-0b7d-41f9-8336-a7ae35b5797e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.007 183755 WARNING nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.010 183755 DEBUG nova.virt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-456505752', uuid='6382ecba-36e3-4f7d-81c9-3951a047b87f'), owner=OwnerMeta(userid='fcaa4f96208c43aa80de7acda91b9da8', username='tempest-TestDataModel-1034755695-project-admin', projectid='dbb53a6bca6547238f0b67bd582b0ea5', projectname='tempest-TestDataModel-1034755695'), image=ImageMeta(id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "address": "fa:16:3e:d0:5a:fa", "network": {"id": "003e25ef-7f26-49e0-80b8-c9281daaa995", "bridge": "br-int", "label": "tempest-TestDataModel-1766331649-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3ad1e42a07f4ac1a56d65a2f5f430d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c1ce5da-50", "ovs_interfaceid": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769552439.0097485) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.014 183755 DEBUG nova.virt.libvirt.host [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.015 183755 DEBUG nova.virt.libvirt.host [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.018 183755 DEBUG nova.virt.libvirt.host [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.018 183755 DEBUG nova.virt.libvirt.host [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.020 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.020 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-27T22:07:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.020 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.020 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.021 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.021 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.021 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.021 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.021 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.021 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.021 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.021 183755 DEBUG nova.virt.hardware [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.025 183755 DEBUG nova.privsep.utils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.026 183755 DEBUG nova.virt.libvirt.vif [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:20:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-456505752',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-456505752',id=2,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dbb53a6bca6547238f0b67bd582b0ea5',ramdisk_id='',reservation_id='r-9ujhy045',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,member,reader',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-1034755695',owner_user_name='tempest-TestDataModel-1034755695-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:20:31Z,user_data=None,user_id='fcaa4f96208c43aa80de7acda91b9da8',uuid=6382ecba-36e3-4f7d-81c9-3951a047b87f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "address": "fa:16:3e:d0:5a:fa", "network": {"id": "003e25ef-7f26-49e0-80b8-c9281daaa995", "bridge": "br-int", "label": "tempest-TestDataModel-1766331649-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3ad1e42a07f4ac1a56d65a2f5f430d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c1ce5da-50", "ovs_interfaceid": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.027 183755 DEBUG nova.network.os_vif_util [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Converting VIF {"id": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "address": "fa:16:3e:d0:5a:fa", "network": {"id": "003e25ef-7f26-49e0-80b8-c9281daaa995", "bridge": "br-int", "label": "tempest-TestDataModel-1766331649-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3ad1e42a07f4ac1a56d65a2f5f430d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c1ce5da-50", "ovs_interfaceid": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.027 183755 DEBUG nova.network.os_vif_util [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:5a:fa,bridge_name='br-int',has_traffic_filtering=True,id=2c1ce5da-5066-473b-8e3b-41b4f5ec32ff,network=Network(003e25ef-7f26-49e0-80b8-c9281daaa995),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c1ce5da-50') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.029 183755 DEBUG nova.objects.instance [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6382ecba-36e3-4f7d-81c9-3951a047b87f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.537 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] End _get_guest_xml xml=<domain type="kvm">
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <uuid>6382ecba-36e3-4f7d-81c9-3951a047b87f</uuid>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <name>instance-00000002</name>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <memory>131072</memory>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <vcpu>1</vcpu>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <metadata>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <nova:name>tempest-TestDataModel-server-456505752</nova:name>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <nova:creationTime>2026-01-27 22:20:39</nova:creationTime>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <nova:flavor name="m1.nano" id="4ed8ebce-d846-4d13-950a-4fb8c3a02942">
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:memory>128</nova:memory>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:disk>1</nova:disk>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:swap>0</nova:swap>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:ephemeral>0</nova:ephemeral>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:vcpus>1</nova:vcpus>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:extraSpecs>
Jan 27 22:20:39 compute-1 nova_compute[183751]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         </nova:extraSpecs>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       </nova:flavor>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <nova:image uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e">
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:minDisk>1</nova:minDisk>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:minRam>0</nova:minRam>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:properties>
Jan 27 22:20:39 compute-1 nova_compute[183751]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         </nova:properties>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       </nova:image>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <nova:owner>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:user uuid="fcaa4f96208c43aa80de7acda91b9da8">tempest-TestDataModel-1034755695-project-admin</nova:user>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:project uuid="dbb53a6bca6547238f0b67bd582b0ea5">tempest-TestDataModel-1034755695</nova:project>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       </nova:owner>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <nova:root type="image" uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <nova:ports>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         <nova:port uuid="2c1ce5da-5066-473b-8e3b-41b4f5ec32ff">
Jan 27 22:20:39 compute-1 nova_compute[183751]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:         </nova:port>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       </nova:ports>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     </nova:instance>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   </metadata>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <sysinfo type="smbios">
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <system>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <entry name="manufacturer">RDO</entry>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <entry name="product">OpenStack Compute</entry>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <entry name="serial">6382ecba-36e3-4f7d-81c9-3951a047b87f</entry>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <entry name="uuid">6382ecba-36e3-4f7d-81c9-3951a047b87f</entry>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <entry name="family">Virtual Machine</entry>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     </system>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   </sysinfo>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <os>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <boot dev="hd"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <smbios mode="sysinfo"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   </os>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <features>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <acpi/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <apic/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <vmcoreinfo/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   </features>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <clock offset="utc">
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <timer name="pit" tickpolicy="delay"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <timer name="hpet" present="no"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   </clock>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <cpu mode="custom" match="exact">
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <model>Nehalem</model>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <topology sockets="1" cores="1" threads="1"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   </cpu>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   <devices>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <disk type="file" device="disk">
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <target dev="vda" bus="virtio"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <disk type="file" device="cdrom">
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <driver name="qemu" type="raw" cache="none"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk.config"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <target dev="sda" bus="sata"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <interface type="ethernet">
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <mac address="fa:16:3e:d0:5a:fa"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <driver name="vhost" rx_queue_size="512"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <mtu size="1442"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <target dev="tap2c1ce5da-50"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     </interface>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <serial type="pty">
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <log file="/var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/console.log" append="off"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     </serial>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <video>
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     </video>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <input type="tablet" bus="usb"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <rng model="virtio">
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <backend model="random">/dev/urandom</backend>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     </rng>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <controller type="usb" index="0"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 27 22:20:39 compute-1 nova_compute[183751]:       <stats period="10"/>
Jan 27 22:20:39 compute-1 nova_compute[183751]:     </memballoon>
Jan 27 22:20:39 compute-1 nova_compute[183751]:   </devices>
Jan 27 22:20:39 compute-1 nova_compute[183751]: </domain>
Jan 27 22:20:39 compute-1 nova_compute[183751]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.539 183755 DEBUG nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Preparing to wait for external event network-vif-plugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.540 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.540 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.541 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.542 183755 DEBUG nova.virt.libvirt.vif [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:20:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-456505752',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-456505752',id=2,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dbb53a6bca6547238f0b67bd582b0ea5',ramdisk_id='',reservation_id='r-9ujhy045',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,member,reader',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-1034755695',owner_user_name='tempest-TestDataModel-1034755695-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:20:31Z,user_data=None,user_id='fcaa4f96208c43aa80de7acda91b9da8',uuid=6382ecba-36e3-4f7d-81c9-3951a047b87f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "address": "fa:16:3e:d0:5a:fa", "network": {"id": "003e25ef-7f26-49e0-80b8-c9281daaa995", "bridge": "br-int", "label": "tempest-TestDataModel-1766331649-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3ad1e42a07f4ac1a56d65a2f5f430d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c1ce5da-50", "ovs_interfaceid": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.542 183755 DEBUG nova.network.os_vif_util [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Converting VIF {"id": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "address": "fa:16:3e:d0:5a:fa", "network": {"id": "003e25ef-7f26-49e0-80b8-c9281daaa995", "bridge": "br-int", "label": "tempest-TestDataModel-1766331649-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3ad1e42a07f4ac1a56d65a2f5f430d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c1ce5da-50", "ovs_interfaceid": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.543 183755 DEBUG nova.network.os_vif_util [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:5a:fa,bridge_name='br-int',has_traffic_filtering=True,id=2c1ce5da-5066-473b-8e3b-41b4f5ec32ff,network=Network(003e25ef-7f26-49e0-80b8-c9281daaa995),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c1ce5da-50') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.544 183755 DEBUG os_vif [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:5a:fa,bridge_name='br-int',has_traffic_filtering=True,id=2c1ce5da-5066-473b-8e3b-41b4f5ec32ff,network=Network(003e25ef-7f26-49e0-80b8-c9281daaa995),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c1ce5da-50') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.615 183755 DEBUG ovsdbapp.backend.ovs_idl [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.616 183755 DEBUG ovsdbapp.backend.ovs_idl [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.616 183755 DEBUG ovsdbapp.backend.ovs_idl [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.617 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.619 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.619 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.620 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.623 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.627 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.639 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.640 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.641 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.642 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.643 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7b17a545-8211-5fd7-9385-5ff7d38d2755', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.645 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.648 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:20:39 compute-1 nova_compute[183751]: 2026-01-27 22:20:39.650 183755 INFO oslo.privsep.daemon [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp6_xy8bhd/privsep.sock']
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.459 183755 INFO oslo.privsep.daemon [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Spawned new privsep daemon via rootwrap
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.281 215411 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.288 215411 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.292 215411 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.292 215411 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215411
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.719 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.719 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c1ce5da-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.720 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2c1ce5da-50, col_values=(('qos', UUID('908d55b2-661e-43b8-8ac0-484cf985cc48')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.721 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2c1ce5da-50, col_values=(('external_ids', {'iface-id': '2c1ce5da-5066-473b-8e3b-41b4f5ec32ff', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:5a:fa', 'vm-uuid': '6382ecba-36e3-4f7d-81c9-3951a047b87f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.723 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:40 compute-1 NetworkManager[56069]: <info>  [1769552440.7247] manager: (tap2c1ce5da-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.726 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.732 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.733 183755 INFO os_vif [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:5a:fa,bridge_name='br-int',has_traffic_filtering=True,id=2c1ce5da-5066-473b-8e3b-41b4f5ec32ff,network=Network(003e25ef-7f26-49e0-80b8-c9281daaa995),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c1ce5da-50')
Jan 27 22:20:40 compute-1 nova_compute[183751]: 2026-01-27 22:20:40.766 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:42 compute-1 nova_compute[183751]: 2026-01-27 22:20:42.279 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:20:42 compute-1 nova_compute[183751]: 2026-01-27 22:20:42.279 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:20:42 compute-1 nova_compute[183751]: 2026-01-27 22:20:42.280 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] No VIF found with MAC fa:16:3e:d0:5a:fa, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:20:42 compute-1 nova_compute[183751]: 2026-01-27 22:20:42.281 183755 INFO nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Using config drive
Jan 27 22:20:42 compute-1 nova_compute[183751]: 2026-01-27 22:20:42.794 183755 WARNING neutronclient.v2_0.client [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:20:43 compute-1 nova_compute[183751]: 2026-01-27 22:20:43.424 183755 INFO nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Creating config drive at /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk.config
Jan 27 22:20:43 compute-1 nova_compute[183751]: 2026-01-27 22:20:43.430 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpwrnqhffa execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:20:43 compute-1 nova_compute[183751]: 2026-01-27 22:20:43.568 183755 DEBUG oslo_concurrency.processutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpwrnqhffa" returned: 0 in 0.138s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:20:43 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 27 22:20:43 compute-1 kernel: tap2c1ce5da-50: entered promiscuous mode
Jan 27 22:20:43 compute-1 NetworkManager[56069]: <info>  [1769552443.6786] manager: (tap2c1ce5da-50): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Jan 27 22:20:43 compute-1 ovn_controller[95969]: 2026-01-27T22:20:43Z|00040|binding|INFO|Claiming lport 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff for this chassis.
Jan 27 22:20:43 compute-1 ovn_controller[95969]: 2026-01-27T22:20:43Z|00041|binding|INFO|2c1ce5da-5066-473b-8e3b-41b4f5ec32ff: Claiming fa:16:3e:d0:5a:fa 10.100.0.10
Jan 27 22:20:43 compute-1 nova_compute[183751]: 2026-01-27 22:20:43.681 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:43 compute-1 nova_compute[183751]: 2026-01-27 22:20:43.684 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.706 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:5a:fa 10.100.0.10'], port_security=['fa:16:3e:d0:5a:fa 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6382ecba-36e3-4f7d-81c9-3951a047b87f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-003e25ef-7f26-49e0-80b8-c9281daaa995', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dbb53a6bca6547238f0b67bd582b0ea5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2d904869-4d42-4d7e-a2a8-3489bee49b82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54db84df-1bc7-40a2-98c5-ed19faa4a30e, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=2c1ce5da-5066-473b-8e3b-41b4f5ec32ff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.706 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff in datapath 003e25ef-7f26-49e0-80b8-c9281daaa995 bound to our chassis
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.708 105247 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 003e25ef-7f26-49e0-80b8-c9281daaa995
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.736 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c957c632-8ba5-4f4f-8cbb-89cea7689297]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.737 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap003e25ef-71 in ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.741 212869 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap003e25ef-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.741 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[b8995526-dff9-41ff-8b05-565fddcb4241]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:43 compute-1 systemd-udevd[215442]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.742 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6a8b2f-7d35-42f3-8e4e-9bffd778fa72]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:43 compute-1 systemd-machined[155034]: New machine qemu-1-instance-00000002.
Jan 27 22:20:43 compute-1 nova_compute[183751]: 2026-01-27 22:20:43.757 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:43 compute-1 ovn_controller[95969]: 2026-01-27T22:20:43Z|00042|binding|INFO|Setting lport 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff ovn-installed in OVS
Jan 27 22:20:43 compute-1 ovn_controller[95969]: 2026-01-27T22:20:43Z|00043|binding|INFO|Setting lport 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff up in Southbound
Jan 27 22:20:43 compute-1 NetworkManager[56069]: <info>  [1769552443.7632] device (tap2c1ce5da-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 22:20:43 compute-1 nova_compute[183751]: 2026-01-27 22:20:43.762 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:43 compute-1 NetworkManager[56069]: <info>  [1769552443.7639] device (tap2c1ce5da-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 27 22:20:43 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.768 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[4fbb648a-63f6-43ad-bccc-65f63e06f8f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.777 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[71297081-202a-4313-85b2-8c405c94faee]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:43.779 105247 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpapi9_x7l/privsep.sock']
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.026 183755 DEBUG nova.compute.manager [req-1a272091-5d11-4ad6-9f60-fd456df9d727 req-c6a53183-7365-46ec-946f-b520da7a695a 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Received event network-vif-plugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.026 183755 DEBUG oslo_concurrency.lockutils [req-1a272091-5d11-4ad6-9f60-fd456df9d727 req-c6a53183-7365-46ec-946f-b520da7a695a 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.027 183755 DEBUG oslo_concurrency.lockutils [req-1a272091-5d11-4ad6-9f60-fd456df9d727 req-c6a53183-7365-46ec-946f-b520da7a695a 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.027 183755 DEBUG oslo_concurrency.lockutils [req-1a272091-5d11-4ad6-9f60-fd456df9d727 req-c6a53183-7365-46ec-946f-b520da7a695a 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.027 183755 DEBUG nova.compute.manager [req-1a272091-5d11-4ad6-9f60-fd456df9d727 req-c6a53183-7365-46ec-946f-b520da7a695a 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Processing event network-vif-plugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.441 183755 DEBUG nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.462 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.468 183755 INFO nova.virt.libvirt.driver [-] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Instance spawned successfully.
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.469 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 27 22:20:44 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:44.587 105247 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 27 22:20:44 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:44.588 105247 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpapi9_x7l/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Jan 27 22:20:44 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:44.420 215469 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 27 22:20:44 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:44.424 215469 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 27 22:20:44 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:44.426 215469 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 27 22:20:44 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:44.426 215469 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215469
Jan 27 22:20:44 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:44.590 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[ab455380-2bd3-4ca7-96bc-a41d2c1d3ad0]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.986 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.988 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.989 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.990 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.991 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:20:44 compute-1 nova_compute[183751]: 2026-01-27 22:20:44.992 183755 DEBUG nova.virt.libvirt.driver [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.052 215469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.052 215469 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.052 215469 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.482 215469 INFO oslo_service.backend [-] Loading backend: eventlet
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.487 215469 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Jan 27 22:20:45 compute-1 nova_compute[183751]: 2026-01-27 22:20:45.535 183755 INFO nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Took 13.18 seconds to spawn the instance on the hypervisor.
Jan 27 22:20:45 compute-1 nova_compute[183751]: 2026-01-27 22:20:45.537 183755 DEBUG nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.568 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[2765f45e-ee24-4c5f-a289-04fc3c833316]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 NetworkManager[56069]: <info>  [1769552445.5917] manager: (tap003e25ef-70): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.590 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[bf353a70-77df-431e-a5a2-b4962b60a0a4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 systemd-udevd[215440]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.643 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[4897acce-1aa6-477f-be00-9b06090ded2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.647 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[213c6e6c-f82a-4713-a1ad-559607a08b52]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 NetworkManager[56069]: <info>  [1769552445.6824] device (tap003e25ef-70): carrier: link connected
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.691 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[9b5d6925-03fc-4c71-90e9-edc2ddcb2a6f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.719 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9071d2-d237-4ddb-92ab-11e7b33b2a9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap003e25ef-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:20:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858794, 'reachable_time': 15489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215495, 'error': None, 'target': 'ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 nova_compute[183751]: 2026-01-27 22:20:45.723 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.746 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[42abaa15-4a73-4d06-9785-849cd544c27e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:20b1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 858794, 'tstamp': 858794}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215496, 'error': None, 'target': 'ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 nova_compute[183751]: 2026-01-27 22:20:45.769 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.771 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[b703114d-078b-46db-84c7-cea2c0daccad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap003e25ef-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:20:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858794, 'reachable_time': 15489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215497, 'error': None, 'target': 'ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.814 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[3326fa94-0751-415f-9400-bda6b1252687]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.884 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[27dfac2c-cc8c-45ee-8385-f42b20e39d82]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.885 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap003e25ef-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.885 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.886 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap003e25ef-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:45 compute-1 NetworkManager[56069]: <info>  [1769552445.8893] manager: (tap003e25ef-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 27 22:20:45 compute-1 kernel: tap003e25ef-70: entered promiscuous mode
Jan 27 22:20:45 compute-1 nova_compute[183751]: 2026-01-27 22:20:45.894 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.897 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap003e25ef-70, col_values=(('external_ids', {'iface-id': 'ffc119d7-8c66-43c4-841c-f7703d759400'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:45 compute-1 ovn_controller[95969]: 2026-01-27T22:20:45Z|00044|binding|INFO|Releasing lport ffc119d7-8c66-43c4-841c-f7703d759400 from this chassis (sb_readonly=0)
Jan 27 22:20:45 compute-1 nova_compute[183751]: 2026-01-27 22:20:45.898 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.902 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[be0e51c4-d6dd-4703-8cc8-198303b1bc24]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.903 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.904 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.904 105247 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 003e25ef-7f26-49e0-80b8-c9281daaa995 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.904 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.905 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7e33347f-c8de-4d56-8a93-fe7e989ab259]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.906 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.907 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c18e1bb6-34a1-4708-88b3-e54c6c3c3ad3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.908 105247 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: global
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     log         /dev/log local0 debug
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     log-tag     haproxy-metadata-proxy-003e25ef-7f26-49e0-80b8-c9281daaa995
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     user        root
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     group       root
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     maxconn     1024
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     pidfile     /var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     daemon
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: defaults
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     log global
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     mode http
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     option httplog
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     option dontlognull
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     option http-server-close
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     option forwardfor
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     retries                 3
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     timeout http-request    30s
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     timeout connect         30s
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     timeout client          32s
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     timeout server          32s
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     timeout http-keep-alive 30s
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: listen listener
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     bind 169.254.169.254:80
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     server metadata /var/lib/neutron/metadata_proxy
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:     http-request add-header X-OVN-Network-ID 003e25ef-7f26-49e0-80b8-c9281daaa995
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 27 22:20:45 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:45.913 105247 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995', 'env', 'PROCESS_TAG=haproxy-003e25ef-7f26-49e0-80b8-c9281daaa995', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/003e25ef-7f26-49e0-80b8-c9281daaa995.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 27 22:20:45 compute-1 nova_compute[183751]: 2026-01-27 22:20:45.912 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:46 compute-1 nova_compute[183751]: 2026-01-27 22:20:46.068 183755 INFO nova.compute.manager [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Took 18.55 seconds to build instance.
Jan 27 22:20:46 compute-1 nova_compute[183751]: 2026-01-27 22:20:46.094 183755 DEBUG nova.compute.manager [req-c2991ebe-fce6-4f9e-821a-5c28c0a0dc6c req-5c687d68-ba3c-4194-80bc-1794c50e18a0 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Received event network-vif-plugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:20:46 compute-1 nova_compute[183751]: 2026-01-27 22:20:46.094 183755 DEBUG oslo_concurrency.lockutils [req-c2991ebe-fce6-4f9e-821a-5c28c0a0dc6c req-5c687d68-ba3c-4194-80bc-1794c50e18a0 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:46 compute-1 nova_compute[183751]: 2026-01-27 22:20:46.094 183755 DEBUG oslo_concurrency.lockutils [req-c2991ebe-fce6-4f9e-821a-5c28c0a0dc6c req-5c687d68-ba3c-4194-80bc-1794c50e18a0 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:46 compute-1 nova_compute[183751]: 2026-01-27 22:20:46.094 183755 DEBUG oslo_concurrency.lockutils [req-c2991ebe-fce6-4f9e-821a-5c28c0a0dc6c req-5c687d68-ba3c-4194-80bc-1794c50e18a0 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:46 compute-1 nova_compute[183751]: 2026-01-27 22:20:46.094 183755 DEBUG nova.compute.manager [req-c2991ebe-fce6-4f9e-821a-5c28c0a0dc6c req-5c687d68-ba3c-4194-80bc-1794c50e18a0 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] No waiting events found dispatching network-vif-plugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:20:46 compute-1 nova_compute[183751]: 2026-01-27 22:20:46.095 183755 WARNING nova.compute.manager [req-c2991ebe-fce6-4f9e-821a-5c28c0a0dc6c req-5c687d68-ba3c-4194-80bc-1794c50e18a0 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Received unexpected event network-vif-plugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff for instance with vm_state active and task_state None.
Jan 27 22:20:46 compute-1 podman[215528]: 2026-01-27 22:20:46.355046575 +0000 UTC m=+0.074724641 container create c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 27 22:20:46 compute-1 podman[215528]: 2026-01-27 22:20:46.312212444 +0000 UTC m=+0.031890570 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 22:20:46 compute-1 systemd[1]: Started libpod-conmon-c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc.scope.
Jan 27 22:20:46 compute-1 systemd[1]: Started libcrun container.
Jan 27 22:20:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa1b7835a364c068485a09c181a1349287519807e011a810a2c1c382103edc81/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 22:20:46 compute-1 podman[215528]: 2026-01-27 22:20:46.478207383 +0000 UTC m=+0.197885489 container init c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:20:46 compute-1 podman[215528]: 2026-01-27 22:20:46.48656117 +0000 UTC m=+0.206239266 container start c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:20:46 compute-1 neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995[215543]: [NOTICE]   (215547) : New worker (215549) forked
Jan 27 22:20:46 compute-1 neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995[215543]: [NOTICE]   (215547) : Loading success.
Jan 27 22:20:46 compute-1 nova_compute[183751]: 2026-01-27 22:20:46.578 183755 DEBUG oslo_concurrency.lockutils [None req-95ea2023-b4d8-4e1e-b374-590da03804f5 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.077s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.202 183755 DEBUG oslo_concurrency.lockutils [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "6382ecba-36e3-4f7d-81c9-3951a047b87f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.203 183755 DEBUG oslo_concurrency.lockutils [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.203 183755 DEBUG oslo_concurrency.lockutils [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.204 183755 DEBUG oslo_concurrency.lockutils [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.204 183755 DEBUG oslo_concurrency.lockutils [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.219 183755 INFO nova.compute.manager [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Terminating instance
Jan 27 22:20:49 compute-1 openstack_network_exporter[195945]: ERROR   22:20:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:20:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:20:49 compute-1 openstack_network_exporter[195945]: ERROR   22:20:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:20:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.738 183755 DEBUG nova.compute.manager [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 27 22:20:49 compute-1 kernel: tap2c1ce5da-50 (unregistering): left promiscuous mode
Jan 27 22:20:49 compute-1 NetworkManager[56069]: <info>  [1769552449.7605] device (tap2c1ce5da-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 27 22:20:49 compute-1 ovn_controller[95969]: 2026-01-27T22:20:49Z|00045|binding|INFO|Releasing lport 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff from this chassis (sb_readonly=0)
Jan 27 22:20:49 compute-1 ovn_controller[95969]: 2026-01-27T22:20:49Z|00046|binding|INFO|Setting lport 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff down in Southbound
Jan 27 22:20:49 compute-1 ovn_controller[95969]: 2026-01-27T22:20:49Z|00047|binding|INFO|Removing iface tap2c1ce5da-50 ovn-installed in OVS
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.768 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:49.775 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:5a:fa 10.100.0.10'], port_security=['fa:16:3e:d0:5a:fa 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6382ecba-36e3-4f7d-81c9-3951a047b87f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-003e25ef-7f26-49e0-80b8-c9281daaa995', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dbb53a6bca6547238f0b67bd582b0ea5', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2d904869-4d42-4d7e-a2a8-3489bee49b82', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54db84df-1bc7-40a2-98c5-ed19faa4a30e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=2c1ce5da-5066-473b-8e3b-41b4f5ec32ff) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:20:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:49.776 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff in datapath 003e25ef-7f26-49e0-80b8-c9281daaa995 unbound from our chassis
Jan 27 22:20:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:49.778 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 003e25ef-7f26-49e0-80b8-c9281daaa995, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:20:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:49.779 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[e8453476-aff2-4ce7-b8fe-3347a23f744b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:49.782 105247 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995 namespace which is not needed anymore
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.787 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:49 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 27 22:20:49 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 6.098s CPU time.
Jan 27 22:20:49 compute-1 systemd-machined[155034]: Machine qemu-1-instance-00000002 terminated.
Jan 27 22:20:49 compute-1 neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995[215543]: [NOTICE]   (215547) : haproxy version is 3.0.5-8e879a5
Jan 27 22:20:49 compute-1 neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995[215543]: [NOTICE]   (215547) : path to executable is /usr/sbin/haproxy
Jan 27 22:20:49 compute-1 neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995[215543]: [WARNING]  (215547) : Exiting Master process...
Jan 27 22:20:49 compute-1 podman[215583]: 2026-01-27 22:20:49.909323852 +0000 UTC m=+0.035885919 container kill c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 27 22:20:49 compute-1 neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995[215543]: [ALERT]    (215547) : Current worker (215549) exited with code 143 (Terminated)
Jan 27 22:20:49 compute-1 neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995[215543]: [WARNING]  (215547) : All workers exited. Exiting... (0)
Jan 27 22:20:49 compute-1 systemd[1]: libpod-c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc.scope: Deactivated successfully.
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.963 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:49 compute-1 nova_compute[183751]: 2026-01-27 22:20:49.969 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:49 compute-1 podman[215600]: 2026-01-27 22:20:49.976832233 +0000 UTC m=+0.041258552 container died c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:20:50 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc-userdata-shm.mount: Deactivated successfully.
Jan 27 22:20:50 compute-1 systemd[1]: var-lib-containers-storage-overlay-aa1b7835a364c068485a09c181a1349287519807e011a810a2c1c382103edc81-merged.mount: Deactivated successfully.
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.019 183755 INFO nova.virt.libvirt.driver [-] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Instance destroyed successfully.
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.019 183755 DEBUG nova.objects.instance [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lazy-loading 'resources' on Instance uuid 6382ecba-36e3-4f7d-81c9-3951a047b87f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:20:50 compute-1 podman[215600]: 2026-01-27 22:20:50.031462066 +0000 UTC m=+0.095888285 container cleanup c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 22:20:50 compute-1 systemd[1]: libpod-conmon-c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc.scope: Deactivated successfully.
Jan 27 22:20:50 compute-1 podman[215601]: 2026-01-27 22:20:50.050275481 +0000 UTC m=+0.110990758 container remove c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.063 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[58f468ab-15e1-4cb0-b878-033f5ce99fd3]: (4, ("Tue Jan 27 10:20:49 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995 (c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc)\nc3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc\nTue Jan 27 10:20:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995 (c3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc)\nc3a1a13d3e76e05fc1d820ea474f0fd59d3ec5218d3ff73039d3a88823ce4cdc\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.065 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[cf8c6323-fd6c-490e-9de5-9d70c80764ee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.065 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/003e25ef-7f26-49e0-80b8-c9281daaa995.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.066 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[91805f3f-5b32-40bc-8769-9d8940e7ecbf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.067 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap003e25ef-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.069 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:50 compute-1 kernel: tap003e25ef-70: left promiscuous mode
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.094 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.094 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.096 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[08f0abe5-d267-4f3b-b29b-576a7b7f8bb5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.112 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[55b244a9-a585-415b-b386-85288f3cefee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.113 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd87038-91ee-4f26-8a15-60ecd2d90b64]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.131 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[17abde8e-083a-4701-afb8-c4b40dcebff8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 858782, 'reachable_time': 21490, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215647, 'error': None, 'target': 'ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.137 105687 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-003e25ef-7f26-49e0-80b8-c9281daaa995 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 27 22:20:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:20:50.137 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f24b4b-b9e6-4ca7-99d1-5f0e7fdc3bb6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:20:50 compute-1 systemd[1]: run-netns-ovnmeta\x2d003e25ef\x2d7f26\x2d49e0\x2d80b8\x2dc9281daaa995.mount: Deactivated successfully.
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.526 183755 DEBUG nova.virt.libvirt.vif [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-27T22:20:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-456505752',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testdatamodel-server-456505752',id=2,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-27T22:20:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dbb53a6bca6547238f0b67bd582b0ea5',ramdisk_id='',reservation_id='r-9ujhy045',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,member,reader',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_pro
ject_name='tempest-TestDataModel-1034755695',owner_user_name='tempest-TestDataModel-1034755695-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-27T22:20:45Z,user_data=None,user_id='fcaa4f96208c43aa80de7acda91b9da8',uuid=6382ecba-36e3-4f7d-81c9-3951a047b87f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "address": "fa:16:3e:d0:5a:fa", "network": {"id": "003e25ef-7f26-49e0-80b8-c9281daaa995", "bridge": "br-int", "label": "tempest-TestDataModel-1766331649-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3ad1e42a07f4ac1a56d65a2f5f430d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c1ce5da-50", "ovs_interfaceid": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.527 183755 DEBUG nova.network.os_vif_util [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Converting VIF {"id": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "address": "fa:16:3e:d0:5a:fa", "network": {"id": "003e25ef-7f26-49e0-80b8-c9281daaa995", "bridge": "br-int", "label": "tempest-TestDataModel-1766331649-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3ad1e42a07f4ac1a56d65a2f5f430d2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c1ce5da-50", "ovs_interfaceid": "2c1ce5da-5066-473b-8e3b-41b4f5ec32ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.527 183755 DEBUG nova.network.os_vif_util [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:5a:fa,bridge_name='br-int',has_traffic_filtering=True,id=2c1ce5da-5066-473b-8e3b-41b4f5ec32ff,network=Network(003e25ef-7f26-49e0-80b8-c9281daaa995),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c1ce5da-50') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.528 183755 DEBUG os_vif [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:5a:fa,bridge_name='br-int',has_traffic_filtering=True,id=2c1ce5da-5066-473b-8e3b-41b4f5ec32ff,network=Network(003e25ef-7f26-49e0-80b8-c9281daaa995),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c1ce5da-50') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.530 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.531 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c1ce5da-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.533 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.534 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.535 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.535 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=908d55b2-661e-43b8-8ac0-484cf985cc48) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.536 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.536 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.540 183755 INFO os_vif [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:5a:fa,bridge_name='br-int',has_traffic_filtering=True,id=2c1ce5da-5066-473b-8e3b-41b4f5ec32ff,network=Network(003e25ef-7f26-49e0-80b8-c9281daaa995),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c1ce5da-50')
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.541 183755 INFO nova.virt.libvirt.driver [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Deleting instance files /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f_del
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.542 183755 INFO nova.virt.libvirt.driver [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Deletion of /var/lib/nova/instances/6382ecba-36e3-4f7d-81c9-3951a047b87f_del complete
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.708 183755 DEBUG nova.compute.manager [req-15e22c3c-e1e3-48c9-b211-d8907ce92362 req-10e1bf2f-823c-429b-b409-71343300ce3b 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Received event network-vif-unplugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.709 183755 DEBUG oslo_concurrency.lockutils [req-15e22c3c-e1e3-48c9-b211-d8907ce92362 req-10e1bf2f-823c-429b-b409-71343300ce3b 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.710 183755 DEBUG oslo_concurrency.lockutils [req-15e22c3c-e1e3-48c9-b211-d8907ce92362 req-10e1bf2f-823c-429b-b409-71343300ce3b 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.710 183755 DEBUG oslo_concurrency.lockutils [req-15e22c3c-e1e3-48c9-b211-d8907ce92362 req-10e1bf2f-823c-429b-b409-71343300ce3b 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.711 183755 DEBUG nova.compute.manager [req-15e22c3c-e1e3-48c9-b211-d8907ce92362 req-10e1bf2f-823c-429b-b409-71343300ce3b 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] No waiting events found dispatching network-vif-unplugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.711 183755 DEBUG nova.compute.manager [req-15e22c3c-e1e3-48c9-b211-d8907ce92362 req-10e1bf2f-823c-429b-b409-71343300ce3b 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Received event network-vif-unplugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:20:50 compute-1 nova_compute[183751]: 2026-01-27 22:20:50.773 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:51 compute-1 nova_compute[183751]: 2026-01-27 22:20:51.055 183755 INFO nova.compute.manager [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Took 1.32 seconds to destroy the instance on the hypervisor.
Jan 27 22:20:51 compute-1 nova_compute[183751]: 2026-01-27 22:20:51.056 183755 DEBUG oslo.service.backend._eventlet.loopingcall [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 27 22:20:51 compute-1 nova_compute[183751]: 2026-01-27 22:20:51.056 183755 DEBUG nova.compute.manager [-] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 27 22:20:51 compute-1 nova_compute[183751]: 2026-01-27 22:20:51.057 183755 DEBUG nova.network.neutron [-] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 27 22:20:51 compute-1 nova_compute[183751]: 2026-01-27 22:20:51.057 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:20:51 compute-1 nova_compute[183751]: 2026-01-27 22:20:51.511 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:20:51 compute-1 nova_compute[183751]: 2026-01-27 22:20:51.833 183755 DEBUG nova.compute.manager [req-56442d11-d48e-4ac7-a99c-5ef7f3369753 req-5ee99008-2175-415b-8634-52de0c829ea4 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Received event network-vif-deleted-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:20:51 compute-1 nova_compute[183751]: 2026-01-27 22:20:51.834 183755 INFO nova.compute.manager [req-56442d11-d48e-4ac7-a99c-5ef7f3369753 req-5ee99008-2175-415b-8634-52de0c829ea4 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Neutron deleted interface 2c1ce5da-5066-473b-8e3b-41b4f5ec32ff; detaching it from the instance and deleting it from the info cache
Jan 27 22:20:51 compute-1 nova_compute[183751]: 2026-01-27 22:20:51.835 183755 DEBUG nova.network.neutron [req-56442d11-d48e-4ac7-a99c-5ef7f3369753 req-5ee99008-2175-415b-8634-52de0c829ea4 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:20:52 compute-1 nova_compute[183751]: 2026-01-27 22:20:52.279 183755 DEBUG nova.network.neutron [-] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:20:52 compute-1 nova_compute[183751]: 2026-01-27 22:20:52.344 183755 DEBUG nova.compute.manager [req-56442d11-d48e-4ac7-a99c-5ef7f3369753 req-5ee99008-2175-415b-8634-52de0c829ea4 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Detach interface failed, port_id=2c1ce5da-5066-473b-8e3b-41b4f5ec32ff, reason: Instance 6382ecba-36e3-4f7d-81c9-3951a047b87f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 27 22:20:52 compute-1 nova_compute[183751]: 2026-01-27 22:20:52.786 183755 INFO nova.compute.manager [-] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Took 1.73 seconds to deallocate network for instance.
Jan 27 22:20:52 compute-1 nova_compute[183751]: 2026-01-27 22:20:52.803 183755 DEBUG nova.compute.manager [req-845a1a6f-f21e-4973-91bd-70a32fe515a2 req-d20c0cf3-82b0-4c7c-b085-e03119bd9714 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Received event network-vif-unplugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:20:52 compute-1 nova_compute[183751]: 2026-01-27 22:20:52.804 183755 DEBUG oslo_concurrency.lockutils [req-845a1a6f-f21e-4973-91bd-70a32fe515a2 req-d20c0cf3-82b0-4c7c-b085-e03119bd9714 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:52 compute-1 nova_compute[183751]: 2026-01-27 22:20:52.804 183755 DEBUG oslo_concurrency.lockutils [req-845a1a6f-f21e-4973-91bd-70a32fe515a2 req-d20c0cf3-82b0-4c7c-b085-e03119bd9714 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:52 compute-1 nova_compute[183751]: 2026-01-27 22:20:52.805 183755 DEBUG oslo_concurrency.lockutils [req-845a1a6f-f21e-4973-91bd-70a32fe515a2 req-d20c0cf3-82b0-4c7c-b085-e03119bd9714 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:52 compute-1 nova_compute[183751]: 2026-01-27 22:20:52.805 183755 DEBUG nova.compute.manager [req-845a1a6f-f21e-4973-91bd-70a32fe515a2 req-d20c0cf3-82b0-4c7c-b085-e03119bd9714 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] No waiting events found dispatching network-vif-unplugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:20:52 compute-1 nova_compute[183751]: 2026-01-27 22:20:52.805 183755 DEBUG nova.compute.manager [req-845a1a6f-f21e-4973-91bd-70a32fe515a2 req-d20c0cf3-82b0-4c7c-b085-e03119bd9714 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 6382ecba-36e3-4f7d-81c9-3951a047b87f] Received event network-vif-unplugged-2c1ce5da-5066-473b-8e3b-41b4f5ec32ff for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:20:53 compute-1 nova_compute[183751]: 2026-01-27 22:20:53.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:20:53 compute-1 nova_compute[183751]: 2026-01-27 22:20:53.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:20:53 compute-1 nova_compute[183751]: 2026-01-27 22:20:53.319 183755 DEBUG oslo_concurrency.lockutils [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:53 compute-1 nova_compute[183751]: 2026-01-27 22:20:53.320 183755 DEBUG oslo_concurrency.lockutils [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:53 compute-1 nova_compute[183751]: 2026-01-27 22:20:53.390 183755 DEBUG nova.compute.provider_tree [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:20:53 compute-1 nova_compute[183751]: 2026-01-27 22:20:53.920 183755 ERROR nova.scheduler.client.report [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] [req-3b99adf7-fded-42b5-b451-8cda8d091fa2] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 18406e9c-09cc-4d76-bc69-d3d1c0683e05.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-3b99adf7-fded-42b5-b451-8cda8d091fa2"}]}
Jan 27 22:20:53 compute-1 nova_compute[183751]: 2026-01-27 22:20:53.939 183755 DEBUG nova.scheduler.client.report [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 22:20:53 compute-1 nova_compute[183751]: 2026-01-27 22:20:53.967 183755 DEBUG nova.scheduler.client.report [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 22:20:53 compute-1 nova_compute[183751]: 2026-01-27 22:20:53.967 183755 DEBUG nova.compute.provider_tree [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:20:53 compute-1 nova_compute[183751]: 2026-01-27 22:20:53.983 183755 DEBUG nova.scheduler.client.report [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 22:20:54 compute-1 nova_compute[183751]: 2026-01-27 22:20:54.013 183755 DEBUG nova.scheduler.client.report [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO
 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 22:20:54 compute-1 nova_compute[183751]: 2026-01-27 22:20:54.060 183755 DEBUG nova.compute.provider_tree [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:20:54 compute-1 nova_compute[183751]: 2026-01-27 22:20:54.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:20:54 compute-1 nova_compute[183751]: 2026-01-27 22:20:54.630 183755 DEBUG nova.scheduler.client.report [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Updated inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Jan 27 22:20:54 compute-1 nova_compute[183751]: 2026-01-27 22:20:54.630 183755 DEBUG nova.compute.provider_tree [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Updating resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Jan 27 22:20:54 compute-1 nova_compute[183751]: 2026-01-27 22:20:54.631 183755 DEBUG nova.compute.provider_tree [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:20:54 compute-1 nova_compute[183751]: 2026-01-27 22:20:54.659 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.143 183755 DEBUG oslo_concurrency.lockutils [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.823s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.148 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.488s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.148 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.148 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.180 183755 INFO nova.scheduler.client.report [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Deleted allocations for instance 6382ecba-36e3-4f7d-81c9-3951a047b87f
Jan 27 22:20:55 compute-1 podman[215651]: 2026-01-27 22:20:55.36932705 +0000 UTC m=+0.164225526 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS)
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.383 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.385 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.424 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.425 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5806MB free_disk=73.14280700683594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.425 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.425 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.563 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:55 compute-1 nova_compute[183751]: 2026-01-27 22:20:55.775 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:20:56 compute-1 nova_compute[183751]: 2026-01-27 22:20:56.208 183755 DEBUG oslo_concurrency.lockutils [None req-82c4b5ff-59d1-42d9-8e85-bcd3132250c4 fcaa4f96208c43aa80de7acda91b9da8 dbb53a6bca6547238f0b67bd582b0ea5 - - default default] Lock "6382ecba-36e3-4f7d-81c9-3951a047b87f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:56 compute-1 nova_compute[183751]: 2026-01-27 22:20:56.476 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:20:56 compute-1 nova_compute[183751]: 2026-01-27 22:20:56.476 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:20:55 up  2:23,  0 user,  load average: 0.46, 0.22, 0.08\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:20:56 compute-1 nova_compute[183751]: 2026-01-27 22:20:56.509 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:20:57 compute-1 nova_compute[183751]: 2026-01-27 22:20:57.018 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:20:57 compute-1 nova_compute[183751]: 2026-01-27 22:20:57.530 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:20:57 compute-1 nova_compute[183751]: 2026-01-27 22:20:57.531 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.105s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:20:58 compute-1 nova_compute[183751]: 2026-01-27 22:20:58.532 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:20:59 compute-1 nova_compute[183751]: 2026-01-27 22:20:59.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:20:59 compute-1 nova_compute[183751]: 2026-01-27 22:20:59.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:20:59 compute-1 nova_compute[183751]: 2026-01-27 22:20:59.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:20:59 compute-1 podman[215679]: 2026-01-27 22:20:59.792153846 +0000 UTC m=+0.080863563 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 22:20:59 compute-1 podman[215678]: 2026-01-27 22:20:59.809914896 +0000 UTC m=+0.115359827 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 
'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 22:21:00 compute-1 nova_compute[183751]: 2026-01-27 22:21:00.567 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:00 compute-1 nova_compute[183751]: 2026-01-27 22:21:00.776 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:05 compute-1 nova_compute[183751]: 2026-01-27 22:21:05.569 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:05 compute-1 podman[193064]: time="2026-01-27T22:21:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:21:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:21:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:21:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:21:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Jan 27 22:21:05 compute-1 nova_compute[183751]: 2026-01-27 22:21:05.778 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:09 compute-1 podman[215717]: 2026-01-27 22:21:09.822641127 +0000 UTC m=+0.120958655 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:21:10 compute-1 nova_compute[183751]: 2026-01-27 22:21:10.572 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:10 compute-1 nova_compute[183751]: 2026-01-27 22:21:10.781 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:11.261 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:21:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:11.261 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:21:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:11.261 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:21:11 compute-1 nova_compute[183751]: 2026-01-27 22:21:11.855 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:15 compute-1 nova_compute[183751]: 2026-01-27 22:21:15.574 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:15 compute-1 nova_compute[183751]: 2026-01-27 22:21:15.783 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:19 compute-1 openstack_network_exporter[195945]: ERROR   22:21:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:21:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:21:19 compute-1 openstack_network_exporter[195945]: ERROR   22:21:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:21:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:21:20 compute-1 nova_compute[183751]: 2026-01-27 22:21:20.576 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:20 compute-1 nova_compute[183751]: 2026-01-27 22:21:20.786 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:22.355 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:74:e0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7a759a84-186e-4361-8682-406ee1d868b5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a759a84-186e-4361-8682-406ee1d868b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d532be863cf418db3d406ec1b756811', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2faa89f2-04ff-4922-829a-ab6eff55c5c3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fa521dbb-6b22-4fb8-b4ad-9b2c2d675879) old=Port_Binding(mac=['fa:16:3e:92:74:e0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7a759a84-186e-4361-8682-406ee1d868b5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a759a84-186e-4361-8682-406ee1d868b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d532be863cf418db3d406ec1b756811', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:21:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:22.356 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fa521dbb-6b22-4fb8-b4ad-9b2c2d675879 in datapath 7a759a84-186e-4361-8682-406ee1d868b5 updated
Jan 27 22:21:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:22.357 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7a759a84-186e-4361-8682-406ee1d868b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:21:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:22.358 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[6ed0caf7-fb52-4317-9a21-843c8245cc62]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:21:25 compute-1 nova_compute[183751]: 2026-01-27 22:21:25.578 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:25 compute-1 nova_compute[183751]: 2026-01-27 22:21:25.831 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:25 compute-1 podman[215742]: 2026-01-27 22:21:25.891814799 +0000 UTC m=+0.148441776 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 27 22:21:30 compute-1 nova_compute[183751]: 2026-01-27 22:21:30.629 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:30 compute-1 podman[215769]: 2026-01-27 22:21:30.783807568 +0000 UTC m=+0.087587939 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 27 22:21:30 compute-1 podman[215768]: 2026-01-27 22:21:30.806252434 +0000 UTC m=+0.108963258 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 22:21:30 compute-1 nova_compute[183751]: 2026-01-27 22:21:30.834 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:33.542 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:c7:57 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9df500eb-8cb2-4be5-8baf-83da79212744', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9df500eb-8cb2-4be5-8baf-83da79212744', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce4df730ec994d26b350ac48078bd39b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5b44774a-60e6-4b7c-9fd4-d3164d1d4bd9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1efedee0-9e0f-43f1-9ce9-3ff20e0fe05d) old=Port_Binding(mac=['fa:16:3e:f2:c7:57'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-9df500eb-8cb2-4be5-8baf-83da79212744', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9df500eb-8cb2-4be5-8baf-83da79212744', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce4df730ec994d26b350ac48078bd39b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:21:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:33.545 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1efedee0-9e0f-43f1-9ce9-3ff20e0fe05d in datapath 9df500eb-8cb2-4be5-8baf-83da79212744 updated
Jan 27 22:21:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:33.546 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9df500eb-8cb2-4be5-8baf-83da79212744, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:21:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:33.547 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[f1222ff8-1c75-4e20-8f7f-e38b58a80d1c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:21:35 compute-1 podman[193064]: time="2026-01-27T22:21:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:21:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:21:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:21:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:21:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 27 22:21:35 compute-1 nova_compute[183751]: 2026-01-27 22:21:35.678 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:35 compute-1 nova_compute[183751]: 2026-01-27 22:21:35.837 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:38 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:38.671 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:21:38 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:38.672 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:21:38 compute-1 nova_compute[183751]: 2026-01-27 22:21:38.675 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:40 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:21:40.674 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:21:40 compute-1 nova_compute[183751]: 2026-01-27 22:21:40.680 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:40 compute-1 podman[215811]: 2026-01-27 22:21:40.771809067 +0000 UTC m=+0.072307511 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:21:40 compute-1 nova_compute[183751]: 2026-01-27 22:21:40.838 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:45 compute-1 ovn_controller[95969]: 2026-01-27T22:21:45Z|00048|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 27 22:21:45 compute-1 nova_compute[183751]: 2026-01-27 22:21:45.682 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:45 compute-1 nova_compute[183751]: 2026-01-27 22:21:45.840 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:48 compute-1 sshd-session[215836]: Received disconnect from 91.224.92.190 port 61998:11:  [preauth]
Jan 27 22:21:48 compute-1 sshd-session[215836]: Disconnected from authenticating user root 91.224.92.190 port 61998 [preauth]
Jan 27 22:21:49 compute-1 openstack_network_exporter[195945]: ERROR   22:21:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:21:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:21:49 compute-1 openstack_network_exporter[195945]: ERROR   22:21:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:21:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:21:50 compute-1 nova_compute[183751]: 2026-01-27 22:21:50.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:21:50 compute-1 nova_compute[183751]: 2026-01-27 22:21:50.684 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:50 compute-1 nova_compute[183751]: 2026-01-27 22:21:50.843 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:51 compute-1 nova_compute[183751]: 2026-01-27 22:21:51.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:21:53 compute-1 nova_compute[183751]: 2026-01-27 22:21:53.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:21:53 compute-1 nova_compute[183751]: 2026-01-27 22:21:53.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.675 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.676 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.677 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.677 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.686 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.846 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.893 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.894 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.917 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.918 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5877MB free_disk=73.14282608032227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.918 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:21:55 compute-1 nova_compute[183751]: 2026-01-27 22:21:55.919 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:21:56 compute-1 podman[215840]: 2026-01-27 22:21:56.793926826 +0000 UTC m=+0.101990896 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 27 22:21:57 compute-1 nova_compute[183751]: 2026-01-27 22:21:57.101 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:21:57 compute-1 nova_compute[183751]: 2026-01-27 22:21:57.102 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:21:55 up  2:24,  0 user,  load average: 0.39, 0.23, 0.09\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:21:57 compute-1 nova_compute[183751]: 2026-01-27 22:21:57.292 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:21:57 compute-1 nova_compute[183751]: 2026-01-27 22:21:57.803 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:21:58 compute-1 nova_compute[183751]: 2026-01-27 22:21:58.320 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:21:58 compute-1 nova_compute[183751]: 2026-01-27 22:21:58.320 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.402s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:21:58 compute-1 nova_compute[183751]: 2026-01-27 22:21:58.321 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:21:58 compute-1 nova_compute[183751]: 2026-01-27 22:21:58.321 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 22:21:58 compute-1 nova_compute[183751]: 2026-01-27 22:21:58.830 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 22:22:00 compute-1 nova_compute[183751]: 2026-01-27 22:22:00.689 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:00 compute-1 nova_compute[183751]: 2026-01-27 22:22:00.849 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:01 compute-1 podman[215867]: 2026-01-27 22:22:01.767677109 +0000 UTC m=+0.067610655 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:22:01 compute-1 podman[215866]: 2026-01-27 22:22:01.790243978 +0000 UTC m=+0.089535918 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., maintainer=Red Hat, Inc.)
Jan 27 22:22:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:02.221 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:b1:8d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '969a674fc959431aa6fa4699c13a5d15', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50cc5574-cb7b-4d29-9875-3de476f6a1f0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2cdf8180-4a36-4184-8a61-29080b79cf3c) old=Port_Binding(mac=['fa:16:3e:3e:b1:8d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '969a674fc959431aa6fa4699c13a5d15', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:22:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:02.222 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2cdf8180-4a36-4184-8a61-29080b79cf3c in datapath 36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af updated
Jan 27 22:22:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:02.223 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:22:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:02.224 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[aca5df0c-9158-4970-8b15-e26ce3ff2edd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:22:03 compute-1 nova_compute[183751]: 2026-01-27 22:22:03.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:03 compute-1 nova_compute[183751]: 2026-01-27 22:22:03.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:03 compute-1 nova_compute[183751]: 2026-01-27 22:22:03.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:22:03 compute-1 nova_compute[183751]: 2026-01-27 22:22:03.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:03 compute-1 nova_compute[183751]: 2026-01-27 22:22:03.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 22:22:05 compute-1 podman[193064]: time="2026-01-27T22:22:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:22:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:22:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:22:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:22:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Jan 27 22:22:05 compute-1 nova_compute[183751]: 2026-01-27 22:22:05.691 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:05 compute-1 nova_compute[183751]: 2026-01-27 22:22:05.852 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:10 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:10.312 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:ec:07 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-be42c1ac-97f5-4099-b22a-e0a11abe7fb8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-be42c1ac-97f5-4099-b22a-e0a11abe7fb8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c00680f450794ece8c21a7d0ab0378c0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d060e3f8-e7db-4f06-842b-a459c38e0506, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ebd4c641-b8a3-4f76-b632-04781ad71d51) old=Port_Binding(mac=['fa:16:3e:cd:ec:07'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-be42c1ac-97f5-4099-b22a-e0a11abe7fb8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-be42c1ac-97f5-4099-b22a-e0a11abe7fb8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c00680f450794ece8c21a7d0ab0378c0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:22:10 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:10.313 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ebd4c641-b8a3-4f76-b632-04781ad71d51 in datapath be42c1ac-97f5-4099-b22a-e0a11abe7fb8 updated
Jan 27 22:22:10 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:10.314 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network be42c1ac-97f5-4099-b22a-e0a11abe7fb8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:22:10 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:10.315 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f3c932-b559-44bf-99b0-f7a7635b9037]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:22:10 compute-1 nova_compute[183751]: 2026-01-27 22:22:10.694 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:10 compute-1 nova_compute[183751]: 2026-01-27 22:22:10.855 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:11.262 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:22:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:11.263 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:22:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:11.263 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:22:11 compute-1 podman[215908]: 2026-01-27 22:22:11.806308071 +0000 UTC m=+0.111370898 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:22:15 compute-1 nova_compute[183751]: 2026-01-27 22:22:15.696 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:15 compute-1 nova_compute[183751]: 2026-01-27 22:22:15.856 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:16 compute-1 nova_compute[183751]: 2026-01-27 22:22:16.650 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:19 compute-1 openstack_network_exporter[195945]: ERROR   22:22:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:22:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:22:19 compute-1 openstack_network_exporter[195945]: ERROR   22:22:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:22:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:22:20 compute-1 nova_compute[183751]: 2026-01-27 22:22:20.698 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:20 compute-1 nova_compute[183751]: 2026-01-27 22:22:20.859 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:24 compute-1 nova_compute[183751]: 2026-01-27 22:22:24.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:25 compute-1 nova_compute[183751]: 2026-01-27 22:22:25.701 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:25 compute-1 nova_compute[183751]: 2026-01-27 22:22:25.861 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:27 compute-1 podman[215936]: 2026-01-27 22:22:27.823517262 +0000 UTC m=+0.125873987 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 22:22:30 compute-1 nova_compute[183751]: 2026-01-27 22:22:30.705 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:30 compute-1 nova_compute[183751]: 2026-01-27 22:22:30.864 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:32 compute-1 podman[215964]: 2026-01-27 22:22:32.801193563 +0000 UTC m=+0.107733628 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Jan 27 22:22:32 compute-1 podman[215965]: 2026-01-27 22:22:32.823959867 +0000 UTC m=+0.117344856 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260126, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Jan 27 22:22:35 compute-1 podman[193064]: time="2026-01-27T22:22:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:22:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:22:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:22:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:22:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 27 22:22:35 compute-1 nova_compute[183751]: 2026-01-27 22:22:35.707 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:35 compute-1 nova_compute[183751]: 2026-01-27 22:22:35.866 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:40 compute-1 nova_compute[183751]: 2026-01-27 22:22:40.711 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:40 compute-1 nova_compute[183751]: 2026-01-27 22:22:40.869 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:42 compute-1 podman[216004]: 2026-01-27 22:22:42.782476777 +0000 UTC m=+0.087086807 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:22:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:43.393 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:22:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:43.395 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:22:43 compute-1 nova_compute[183751]: 2026-01-27 22:22:43.395 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:45 compute-1 nova_compute[183751]: 2026-01-27 22:22:45.713 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:45 compute-1 nova_compute[183751]: 2026-01-27 22:22:45.870 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:47 compute-1 nova_compute[183751]: 2026-01-27 22:22:47.531 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:22:49.397 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:22:49 compute-1 openstack_network_exporter[195945]: ERROR   22:22:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:22:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:22:49 compute-1 openstack_network_exporter[195945]: ERROR   22:22:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:22:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:22:50 compute-1 nova_compute[183751]: 2026-01-27 22:22:50.715 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:50 compute-1 nova_compute[183751]: 2026-01-27 22:22:50.872 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:51 compute-1 nova_compute[183751]: 2026-01-27 22:22:51.113 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquiring lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:22:51 compute-1 nova_compute[183751]: 2026-01-27 22:22:51.114 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:22:51 compute-1 nova_compute[183751]: 2026-01-27 22:22:51.621 183755 DEBUG nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 27 22:22:51 compute-1 nova_compute[183751]: 2026-01-27 22:22:51.660 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:52 compute-1 nova_compute[183751]: 2026-01-27 22:22:52.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:52 compute-1 nova_compute[183751]: 2026-01-27 22:22:52.192 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:22:52 compute-1 nova_compute[183751]: 2026-01-27 22:22:52.193 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:22:52 compute-1 nova_compute[183751]: 2026-01-27 22:22:52.200 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 27 22:22:52 compute-1 nova_compute[183751]: 2026-01-27 22:22:52.201 183755 INFO nova.compute.claims [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Claim successful on node compute-1.ctlplane.example.com
Jan 27 22:22:53 compute-1 nova_compute[183751]: 2026-01-27 22:22:53.285 183755 DEBUG nova.compute.provider_tree [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:22:53 compute-1 nova_compute[183751]: 2026-01-27 22:22:53.794 183755 DEBUG nova.scheduler.client.report [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:22:54 compute-1 nova_compute[183751]: 2026-01-27 22:22:54.307 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:22:54 compute-1 nova_compute[183751]: 2026-01-27 22:22:54.309 183755 DEBUG nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 27 22:22:54 compute-1 nova_compute[183751]: 2026-01-27 22:22:54.824 183755 DEBUG nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 27 22:22:54 compute-1 nova_compute[183751]: 2026-01-27 22:22:54.824 183755 DEBUG nova.network.neutron [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 27 22:22:54 compute-1 nova_compute[183751]: 2026-01-27 22:22:54.825 183755 WARNING neutronclient.v2_0.client [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:22:54 compute-1 nova_compute[183751]: 2026-01-27 22:22:54.825 183755 WARNING neutronclient.v2_0.client [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:22:55 compute-1 nova_compute[183751]: 2026-01-27 22:22:55.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:55 compute-1 nova_compute[183751]: 2026-01-27 22:22:55.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:55 compute-1 nova_compute[183751]: 2026-01-27 22:22:55.333 183755 INFO nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 27 22:22:55 compute-1 nova_compute[183751]: 2026-01-27 22:22:55.717 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:55 compute-1 nova_compute[183751]: 2026-01-27 22:22:55.848 183755 DEBUG nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 27 22:22:55 compute-1 nova_compute[183751]: 2026-01-27 22:22:55.876 183755 DEBUG nova.network.neutron [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Successfully created port: 85b9ff01-2789-4532-963a-76dbf2aba33a _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 27 22:22:55 compute-1 nova_compute[183751]: 2026-01-27 22:22:55.879 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.676 183755 DEBUG nova.network.neutron [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Successfully updated port: 85b9ff01-2789-4532-963a-76dbf2aba33a _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.760 183755 DEBUG nova.compute.manager [req-b2a5befd-f341-4270-8c39-89275eef1409 req-021364ab-1aaf-4d51-9726-e295eafc4147 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Received event network-changed-85b9ff01-2789-4532-963a-76dbf2aba33a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.761 183755 DEBUG nova.compute.manager [req-b2a5befd-f341-4270-8c39-89275eef1409 req-021364ab-1aaf-4d51-9726-e295eafc4147 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Refreshing instance network info cache due to event network-changed-85b9ff01-2789-4532-963a-76dbf2aba33a. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.761 183755 DEBUG oslo_concurrency.lockutils [req-b2a5befd-f341-4270-8c39-89275eef1409 req-021364ab-1aaf-4d51-9726-e295eafc4147 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "refresh_cache-0a7c153e-00ae-4ad9-b073-e98bf81d6e23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.761 183755 DEBUG oslo_concurrency.lockutils [req-b2a5befd-f341-4270-8c39-89275eef1409 req-021364ab-1aaf-4d51-9726-e295eafc4147 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquired lock "refresh_cache-0a7c153e-00ae-4ad9-b073-e98bf81d6e23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.762 183755 DEBUG nova.network.neutron [req-b2a5befd-f341-4270-8c39-89275eef1409 req-021364ab-1aaf-4d51-9726-e295eafc4147 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Refreshing network info cache for port 85b9ff01-2789-4532-963a-76dbf2aba33a _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.871 183755 DEBUG nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.872 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.873 183755 INFO nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Creating image(s)
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.874 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquiring lock "/var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.874 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "/var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.875 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "/var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.876 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.880 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.882 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.948 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.950 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.951 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.953 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.960 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:22:56 compute-1 nova_compute[183751]: 2026-01-27 22:22:56.961 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.016 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.018 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.056 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.058 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.058 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.115 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.116 183755 DEBUG nova.virt.disk.api [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Checking if we can resize image /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.117 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.173 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.174 183755 DEBUG nova.virt.disk.api [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Cannot resize image /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.175 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.175 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Ensure instance console log exists: /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.176 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.177 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.177 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.184 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquiring lock "refresh_cache-0a7c153e-00ae-4ad9-b073-e98bf81d6e23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.267 183755 WARNING neutronclient.v2_0.client [req-b2a5befd-f341-4270-8c39-89275eef1409 req-021364ab-1aaf-4d51-9726-e295eafc4147 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.667 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.895 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.897 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.924 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.925 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5877MB free_disk=73.14274597167969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.926 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:22:57 compute-1 nova_compute[183751]: 2026-01-27 22:22:57.927 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:22:58 compute-1 nova_compute[183751]: 2026-01-27 22:22:58.141 183755 DEBUG nova.network.neutron [req-b2a5befd-f341-4270-8c39-89275eef1409 req-021364ab-1aaf-4d51-9726-e295eafc4147 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:22:58 compute-1 podman[216045]: 2026-01-27 22:22:58.815601199 +0000 UTC m=+0.118040993 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:22:59 compute-1 nova_compute[183751]: 2026-01-27 22:22:59.021 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Instance 0a7c153e-00ae-4ad9-b073-e98bf81d6e23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 27 22:22:59 compute-1 nova_compute[183751]: 2026-01-27 22:22:59.021 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:22:59 compute-1 nova_compute[183751]: 2026-01-27 22:22:59.022 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:22:57 up  2:25,  0 user,  load average: 0.19, 0.20, 0.09\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_c00680f450794ece8c21a7d0ab0378c0': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:22:59 compute-1 nova_compute[183751]: 2026-01-27 22:22:59.061 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:22:59 compute-1 nova_compute[183751]: 2026-01-27 22:22:59.109 183755 DEBUG nova.network.neutron [req-b2a5befd-f341-4270-8c39-89275eef1409 req-021364ab-1aaf-4d51-9726-e295eafc4147 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:22:59 compute-1 nova_compute[183751]: 2026-01-27 22:22:59.568 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:22:59 compute-1 nova_compute[183751]: 2026-01-27 22:22:59.616 183755 DEBUG oslo_concurrency.lockutils [req-b2a5befd-f341-4270-8c39-89275eef1409 req-021364ab-1aaf-4d51-9726-e295eafc4147 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Releasing lock "refresh_cache-0a7c153e-00ae-4ad9-b073-e98bf81d6e23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:22:59 compute-1 nova_compute[183751]: 2026-01-27 22:22:59.617 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquired lock "refresh_cache-0a7c153e-00ae-4ad9-b073-e98bf81d6e23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:22:59 compute-1 nova_compute[183751]: 2026-01-27 22:22:59.617 183755 DEBUG nova.network.neutron [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 27 22:23:00 compute-1 nova_compute[183751]: 2026-01-27 22:23:00.084 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:23:00 compute-1 nova_compute[183751]: 2026-01-27 22:23:00.085 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.158s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:00 compute-1 nova_compute[183751]: 2026-01-27 22:23:00.270 183755 DEBUG nova.network.neutron [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:23:00 compute-1 nova_compute[183751]: 2026-01-27 22:23:00.719 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:00 compute-1 nova_compute[183751]: 2026-01-27 22:23:00.878 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:01 compute-1 nova_compute[183751]: 2026-01-27 22:23:01.229 183755 WARNING neutronclient.v2_0.client [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:23:01 compute-1 nova_compute[183751]: 2026-01-27 22:23:01.532 183755 DEBUG nova.network.neutron [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Updating instance_info_cache with network_info: [{"id": "85b9ff01-2789-4532-963a-76dbf2aba33a", "address": "fa:16:3e:9a:54:10", "network": {"id": "36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1503795607-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "969a674fc959431aa6fa4699c13a5d15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85b9ff01-27", "ovs_interfaceid": "85b9ff01-2789-4532-963a-76dbf2aba33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.041 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Releasing lock "refresh_cache-0a7c153e-00ae-4ad9-b073-e98bf81d6e23" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.042 183755 DEBUG nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Instance network_info: |[{"id": "85b9ff01-2789-4532-963a-76dbf2aba33a", "address": "fa:16:3e:9a:54:10", "network": {"id": "36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1503795607-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "969a674fc959431aa6fa4699c13a5d15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85b9ff01-27", "ovs_interfaceid": "85b9ff01-2789-4532-963a-76dbf2aba33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.045 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Start _get_guest_xml network_info=[{"id": "85b9ff01-2789-4532-963a-76dbf2aba33a", "address": "fa:16:3e:9a:54:10", "network": {"id": "36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1503795607-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "969a674fc959431aa6fa4699c13a5d15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85b9ff01-27", "ovs_interfaceid": "85b9ff01-2789-4532-963a-76dbf2aba33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '46eb297a-0b7d-41f9-8336-a7ae35b5797e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.050 183755 WARNING nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.052 183755 DEBUG nova.virt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-573176118', uuid='0a7c153e-00ae-4ad9-b073-e98bf81d6e23'), owner=OwnerMeta(userid='9483f5753319427f8ad7c898bb549210', username='tempest-TestExecuteBasicStrategy-483102978-project-admin', projectid='c00680f450794ece8c21a7d0ab0378c0', projectname='tempest-TestExecuteBasicStrategy-483102978'), image=ImageMeta(id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "85b9ff01-2789-4532-963a-76dbf2aba33a", "address": "fa:16:3e:9a:54:10", "network": {"id": "36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1503795607-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "969a674fc959431aa6fa4699c13a5d15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85b9ff01-27", "ovs_interfaceid": "85b9ff01-2789-4532-963a-76dbf2aba33a", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769552582.0525918) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.058 183755 DEBUG nova.virt.libvirt.host [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.059 183755 DEBUG nova.virt.libvirt.host [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.063 183755 DEBUG nova.virt.libvirt.host [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.064 183755 DEBUG nova.virt.libvirt.host [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.066 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.067 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-27T22:07:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.067 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.068 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.068 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.069 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.069 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.070 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.070 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.071 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.071 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.072 183755 DEBUG nova.virt.hardware [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.078 183755 DEBUG nova.virt.libvirt.vif [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:22:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-573176118',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-573176118',id=4,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c00680f450794ece8c21a7d0ab0378c0',ramdisk_id='',reservation_id='r-mp000bdq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,member,reader',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-483102978',owner_user_name='tempest-TestExecuteBasicStrategy-483102978-
project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:22:55Z,user_data=None,user_id='9483f5753319427f8ad7c898bb549210',uuid=0a7c153e-00ae-4ad9-b073-e98bf81d6e23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "85b9ff01-2789-4532-963a-76dbf2aba33a", "address": "fa:16:3e:9a:54:10", "network": {"id": "36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1503795607-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "969a674fc959431aa6fa4699c13a5d15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85b9ff01-27", "ovs_interfaceid": "85b9ff01-2789-4532-963a-76dbf2aba33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.078 183755 DEBUG nova.network.os_vif_util [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Converting VIF {"id": "85b9ff01-2789-4532-963a-76dbf2aba33a", "address": "fa:16:3e:9a:54:10", "network": {"id": "36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1503795607-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "969a674fc959431aa6fa4699c13a5d15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85b9ff01-27", "ovs_interfaceid": "85b9ff01-2789-4532-963a-76dbf2aba33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.080 183755 DEBUG nova.network.os_vif_util [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:54:10,bridge_name='br-int',has_traffic_filtering=True,id=85b9ff01-2789-4532-963a-76dbf2aba33a,network=Network(36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85b9ff01-27') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.083 183755 DEBUG nova.objects.instance [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0a7c153e-00ae-4ad9-b073-e98bf81d6e23 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.594 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] End _get_guest_xml xml=<domain type="kvm">
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <uuid>0a7c153e-00ae-4ad9-b073-e98bf81d6e23</uuid>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <name>instance-00000004</name>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <memory>131072</memory>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <vcpu>1</vcpu>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <metadata>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <nova:name>tempest-TestExecuteBasicStrategy-server-573176118</nova:name>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <nova:creationTime>2026-01-27 22:23:02</nova:creationTime>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <nova:flavor name="m1.nano" id="4ed8ebce-d846-4d13-950a-4fb8c3a02942">
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:memory>128</nova:memory>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:disk>1</nova:disk>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:swap>0</nova:swap>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:ephemeral>0</nova:ephemeral>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:vcpus>1</nova:vcpus>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:extraSpecs>
Jan 27 22:23:02 compute-1 nova_compute[183751]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         </nova:extraSpecs>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       </nova:flavor>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <nova:image uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e">
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:minDisk>1</nova:minDisk>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:minRam>0</nova:minRam>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:properties>
Jan 27 22:23:02 compute-1 nova_compute[183751]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         </nova:properties>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       </nova:image>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <nova:owner>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:user uuid="9483f5753319427f8ad7c898bb549210">tempest-TestExecuteBasicStrategy-483102978-project-admin</nova:user>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:project uuid="c00680f450794ece8c21a7d0ab0378c0">tempest-TestExecuteBasicStrategy-483102978</nova:project>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       </nova:owner>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <nova:root type="image" uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <nova:ports>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         <nova:port uuid="85b9ff01-2789-4532-963a-76dbf2aba33a">
Jan 27 22:23:02 compute-1 nova_compute[183751]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:         </nova:port>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       </nova:ports>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     </nova:instance>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   </metadata>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <sysinfo type="smbios">
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <system>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <entry name="manufacturer">RDO</entry>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <entry name="product">OpenStack Compute</entry>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <entry name="serial">0a7c153e-00ae-4ad9-b073-e98bf81d6e23</entry>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <entry name="uuid">0a7c153e-00ae-4ad9-b073-e98bf81d6e23</entry>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <entry name="family">Virtual Machine</entry>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     </system>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   </sysinfo>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <os>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <boot dev="hd"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <smbios mode="sysinfo"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   </os>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <features>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <acpi/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <apic/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <vmcoreinfo/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   </features>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <clock offset="utc">
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <timer name="pit" tickpolicy="delay"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <timer name="hpet" present="no"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   </clock>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <cpu mode="custom" match="exact">
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <model>Nehalem</model>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <topology sockets="1" cores="1" threads="1"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   </cpu>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   <devices>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <disk type="file" device="disk">
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <target dev="vda" bus="virtio"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <disk type="file" device="cdrom">
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <driver name="qemu" type="raw" cache="none"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk.config"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <target dev="sda" bus="sata"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <interface type="ethernet">
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <mac address="fa:16:3e:9a:54:10"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <driver name="vhost" rx_queue_size="512"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <mtu size="1442"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <target dev="tap85b9ff01-27"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     </interface>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <serial type="pty">
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <log file="/var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/console.log" append="off"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     </serial>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <video>
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     </video>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <input type="tablet" bus="usb"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <rng model="virtio">
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <backend model="random">/dev/urandom</backend>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     </rng>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <controller type="usb" index="0"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 27 22:23:02 compute-1 nova_compute[183751]:       <stats period="10"/>
Jan 27 22:23:02 compute-1 nova_compute[183751]:     </memballoon>
Jan 27 22:23:02 compute-1 nova_compute[183751]:   </devices>
Jan 27 22:23:02 compute-1 nova_compute[183751]: </domain>
Jan 27 22:23:02 compute-1 nova_compute[183751]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.595 183755 DEBUG nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Preparing to wait for external event network-vif-plugged-85b9ff01-2789-4532-963a-76dbf2aba33a prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.595 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquiring lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.596 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.596 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.597 183755 DEBUG nova.virt.libvirt.vif [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:22:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-573176118',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-573176118',id=4,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c00680f450794ece8c21a7d0ab0378c0',ramdisk_id='',reservation_id='r-mp000bdq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,member,reader',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-483102978',owner_user_name='tempest-TestExecuteBasicStrategy-
483102978-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:22:55Z,user_data=None,user_id='9483f5753319427f8ad7c898bb549210',uuid=0a7c153e-00ae-4ad9-b073-e98bf81d6e23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "85b9ff01-2789-4532-963a-76dbf2aba33a", "address": "fa:16:3e:9a:54:10", "network": {"id": "36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1503795607-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "969a674fc959431aa6fa4699c13a5d15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85b9ff01-27", "ovs_interfaceid": "85b9ff01-2789-4532-963a-76dbf2aba33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.597 183755 DEBUG nova.network.os_vif_util [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Converting VIF {"id": "85b9ff01-2789-4532-963a-76dbf2aba33a", "address": "fa:16:3e:9a:54:10", "network": {"id": "36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1503795607-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "969a674fc959431aa6fa4699c13a5d15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85b9ff01-27", "ovs_interfaceid": "85b9ff01-2789-4532-963a-76dbf2aba33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.598 183755 DEBUG nova.network.os_vif_util [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:54:10,bridge_name='br-int',has_traffic_filtering=True,id=85b9ff01-2789-4532-963a-76dbf2aba33a,network=Network(36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85b9ff01-27') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.599 183755 DEBUG os_vif [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:54:10,bridge_name='br-int',has_traffic_filtering=True,id=85b9ff01-2789-4532-963a-76dbf2aba33a,network=Network(36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85b9ff01-27') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.600 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.600 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.601 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.602 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.602 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e7e087df-81cc-57f5-ab1b-e320790260f2', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.644 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.646 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.650 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.650 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85b9ff01-27, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.651 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap85b9ff01-27, col_values=(('qos', UUID('c9fa30ba-dbf0-4bfe-a64a-09c85b1cc49b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.651 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap85b9ff01-27, col_values=(('external_ids', {'iface-id': '85b9ff01-2789-4532-963a-76dbf2aba33a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:54:10', 'vm-uuid': '0a7c153e-00ae-4ad9-b073-e98bf81d6e23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.653 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:02 compute-1 NetworkManager[56069]: <info>  [1769552582.6544] manager: (tap85b9ff01-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.657 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.661 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:02 compute-1 nova_compute[183751]: 2026-01-27 22:23:02.662 183755 INFO os_vif [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:54:10,bridge_name='br-int',has_traffic_filtering=True,id=85b9ff01-2789-4532-963a-76dbf2aba33a,network=Network(36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85b9ff01-27')
Jan 27 22:23:03 compute-1 nova_compute[183751]: 2026-01-27 22:23:03.084 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:23:03 compute-1 nova_compute[183751]: 2026-01-27 22:23:03.085 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:23:03 compute-1 nova_compute[183751]: 2026-01-27 22:23:03.085 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:23:03 compute-1 podman[216074]: 2026-01-27 22:23:03.772801693 +0000 UTC m=+0.080673798 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126)
Jan 27 22:23:03 compute-1 podman[216073]: 2026-01-27 22:23:03.786126293 +0000 UTC m=+0.094395938 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_id=openstack_network_exporter, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, 
Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 22:23:04 compute-1 nova_compute[183751]: 2026-01-27 22:23:04.225 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:23:04 compute-1 nova_compute[183751]: 2026-01-27 22:23:04.225 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:23:04 compute-1 nova_compute[183751]: 2026-01-27 22:23:04.225 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] No VIF found with MAC fa:16:3e:9a:54:10, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:23:04 compute-1 nova_compute[183751]: 2026-01-27 22:23:04.226 183755 INFO nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Using config drive
Jan 27 22:23:04 compute-1 nova_compute[183751]: 2026-01-27 22:23:04.747 183755 WARNING neutronclient.v2_0.client [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.262 183755 INFO nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Creating config drive at /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk.config
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.272 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp26c9kbtt execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.410 183755 DEBUG oslo_concurrency.processutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp26c9kbtt" returned: 0 in 0.138s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:23:05 compute-1 kernel: tap85b9ff01-27: entered promiscuous mode
Jan 27 22:23:05 compute-1 ovn_controller[95969]: 2026-01-27T22:23:05Z|00049|binding|INFO|Claiming lport 85b9ff01-2789-4532-963a-76dbf2aba33a for this chassis.
Jan 27 22:23:05 compute-1 ovn_controller[95969]: 2026-01-27T22:23:05Z|00050|binding|INFO|85b9ff01-2789-4532-963a-76dbf2aba33a: Claiming fa:16:3e:9a:54:10 10.100.0.7
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.500 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:05 compute-1 NetworkManager[56069]: <info>  [1769552585.5037] manager: (tap85b9ff01-27): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.506 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.511 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.523 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:54:10 10.100.0.7'], port_security=['fa:16:3e:9a:54:10 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0a7c153e-00ae-4ad9-b073-e98bf81d6e23', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c00680f450794ece8c21a7d0ab0378c0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba3f7dd4-ddf4-40bb-9daf-0a63e367e0ef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50cc5574-cb7b-4d29-9875-3de476f6a1f0, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=85b9ff01-2789-4532-963a-76dbf2aba33a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.524 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 85b9ff01-2789-4532-963a-76dbf2aba33a in datapath 36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af bound to our chassis
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.527 105247 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af
Jan 27 22:23:05 compute-1 systemd-udevd[216132]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.543 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[ae13608b-9e63-4e91-a11d-63b5d8793938]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.544 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap36f2a4aa-61 in ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.550 212869 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap36f2a4aa-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.551 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[b967b273-65e5-49f8-9321-67717f7a0d0e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.552 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[2d6bfdfe-d553-4812-b644-86f9fda20d6b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 systemd-machined[155034]: New machine qemu-2-instance-00000004.
Jan 27 22:23:05 compute-1 NetworkManager[56069]: <info>  [1769552585.5634] device (tap85b9ff01-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 22:23:05 compute-1 NetworkManager[56069]: <info>  [1769552585.5646] device (tap85b9ff01-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.572 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[701e58e2-4830-4dcc-a26f-80ebd7c9bf5d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.594 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[91b2cfb7-7f72-4ec8-b037-2d0f55f1e289]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.595 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:05 compute-1 ovn_controller[95969]: 2026-01-27T22:23:05Z|00051|binding|INFO|Setting lport 85b9ff01-2789-4532-963a-76dbf2aba33a ovn-installed in OVS
Jan 27 22:23:05 compute-1 ovn_controller[95969]: 2026-01-27T22:23:05Z|00052|binding|INFO|Setting lport 85b9ff01-2789-4532-963a-76dbf2aba33a up in Southbound
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.609 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.627 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[3159b118-11f0-4aeb-9b64-7dc189793b2d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 NetworkManager[56069]: <info>  [1769552585.6352] manager: (tap36f2a4aa-60): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.634 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[45761100-46dc-4463-a400-63487cd82c38]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 podman[193064]: time="2026-01-27T22:23:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:23:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:23:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:23:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:23:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.681 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[269623d7-d60c-4310-a117-9853cd2e8e6e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.685 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[528950e8-edac-4c79-8949-63d1bc3735fe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 NetworkManager[56069]: <info>  [1769552585.7181] device (tap36f2a4aa-60): carrier: link connected
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.725 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[436a6d3e-2c71-464c-8742-721968f74127]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.755 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[387c54c7-8664-4869-b124-db3c9cba39f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap36f2a4aa-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:b1:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 872798, 'reachable_time': 19195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216165, 'error': None, 'target': 'ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.785 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5b3ce2-1dba-4053-a385-0aa63f64613e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:b18d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 872798, 'tstamp': 872798}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216166, 'error': None, 'target': 'ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.802 183755 DEBUG nova.compute.manager [req-403cdc97-4acc-41e5-9877-eca9f3267d34 req-682abdc2-e5a2-4d16-b135-4da7c3bcd925 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Received event network-vif-plugged-85b9ff01-2789-4532-963a-76dbf2aba33a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.803 183755 DEBUG oslo_concurrency.lockutils [req-403cdc97-4acc-41e5-9877-eca9f3267d34 req-682abdc2-e5a2-4d16-b135-4da7c3bcd925 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.803 183755 DEBUG oslo_concurrency.lockutils [req-403cdc97-4acc-41e5-9877-eca9f3267d34 req-682abdc2-e5a2-4d16-b135-4da7c3bcd925 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.804 183755 DEBUG oslo_concurrency.lockutils [req-403cdc97-4acc-41e5-9877-eca9f3267d34 req-682abdc2-e5a2-4d16-b135-4da7c3bcd925 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.804 183755 DEBUG nova.compute.manager [req-403cdc97-4acc-41e5-9877-eca9f3267d34 req-682abdc2-e5a2-4d16-b135-4da7c3bcd925 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Processing event network-vif-plugged-85b9ff01-2789-4532-963a-76dbf2aba33a _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.817 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[0617932c-e756-41fe-8e5b-e6e55038fe39]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap36f2a4aa-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:b1:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 872798, 'reachable_time': 19195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216167, 'error': None, 'target': 'ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.869 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[18ef9edd-38e4-425d-a8ff-50ad82bbeee5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 nova_compute[183751]: 2026-01-27 22:23:05.880 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.963 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c4075a40-783c-42ea-b986-2857e811ee49]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.964 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36f2a4aa-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.965 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:23:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:05.965 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36f2a4aa-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.000 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:06 compute-1 kernel: tap36f2a4aa-60: entered promiscuous mode
Jan 27 22:23:06 compute-1 NetworkManager[56069]: <info>  [1769552586.0011] manager: (tap36f2a4aa-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.004 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.004 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap36f2a4aa-60, col_values=(('external_ids', {'iface-id': '2cdf8180-4a36-4184-8a61-29080b79cf3c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.005 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:06 compute-1 ovn_controller[95969]: 2026-01-27T22:23:06Z|00053|binding|INFO|Releasing lport 2cdf8180-4a36-4184-8a61-29080b79cf3c from this chassis (sb_readonly=0)
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.030 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.033 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7a7db478-528c-47db-b8d0-e6b8d393eca8]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.034 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.034 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.035 105247 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.035 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.036 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[a8fd4365-7308-4c0c-a236-6a34044abba1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.037 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.038 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5949eda2-ee9a-4440-b301-481a1a65b4f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.038 105247 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: global
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     log         /dev/log local0 debug
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     log-tag     haproxy-metadata-proxy-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     user        root
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     group       root
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     maxconn     1024
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     pidfile     /var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     daemon
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: defaults
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     log global
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     mode http
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     option httplog
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     option dontlognull
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     option http-server-close
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     option forwardfor
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     retries                 3
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     timeout http-request    30s
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     timeout connect         30s
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     timeout client          32s
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     timeout server          32s
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     timeout http-keep-alive 30s
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: listen listener
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     bind 169.254.169.254:80
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     server metadata /var/lib/neutron/metadata_proxy
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:     http-request add-header X-OVN-Network-ID 36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 27 22:23:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:06.042 105247 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'env', 'PROCESS_TAG=haproxy-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.252 183755 DEBUG nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.257 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.260 183755 INFO nova.virt.libvirt.driver [-] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Instance spawned successfully.
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.261 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 27 22:23:06 compute-1 podman[216206]: 2026-01-27 22:23:06.464198372 +0000 UTC m=+0.066419505 container create f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 27 22:23:06 compute-1 systemd[1]: Started libpod-conmon-f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44.scope.
Jan 27 22:23:06 compute-1 podman[216206]: 2026-01-27 22:23:06.423903915 +0000 UTC m=+0.026125028 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 22:23:06 compute-1 systemd[1]: Started libcrun container.
Jan 27 22:23:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9c204290b9b802f306a799929b0f5702f706d628b1bf3e0037630ecdbb855a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 22:23:06 compute-1 podman[216206]: 2026-01-27 22:23:06.551396401 +0000 UTC m=+0.153617574 container init f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:23:06 compute-1 podman[216206]: 2026-01-27 22:23:06.560976558 +0000 UTC m=+0.163197691 container start f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Jan 27 22:23:06 compute-1 neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af[216221]: [NOTICE]   (216225) : New worker (216227) forked
Jan 27 22:23:06 compute-1 neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af[216221]: [NOTICE]   (216225) : Loading success.
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.774 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.775 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.776 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.776 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.777 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:23:06 compute-1 nova_compute[183751]: 2026-01-27 22:23:06.777 183755 DEBUG nova.virt.libvirt.driver [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:23:07 compute-1 nova_compute[183751]: 2026-01-27 22:23:07.288 183755 INFO nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Took 10.42 seconds to spawn the instance on the hypervisor.
Jan 27 22:23:07 compute-1 nova_compute[183751]: 2026-01-27 22:23:07.290 183755 DEBUG nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 27 22:23:07 compute-1 nova_compute[183751]: 2026-01-27 22:23:07.656 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:07 compute-1 nova_compute[183751]: 2026-01-27 22:23:07.827 183755 INFO nova.compute.manager [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Took 15.69 seconds to build instance.
Jan 27 22:23:07 compute-1 nova_compute[183751]: 2026-01-27 22:23:07.919 183755 DEBUG nova.compute.manager [req-7ecd7e5d-4d3a-4727-8948-a01e38055392 req-1403846c-83c9-403d-88c4-71c9b00301ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Received event network-vif-plugged-85b9ff01-2789-4532-963a-76dbf2aba33a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:23:07 compute-1 nova_compute[183751]: 2026-01-27 22:23:07.920 183755 DEBUG oslo_concurrency.lockutils [req-7ecd7e5d-4d3a-4727-8948-a01e38055392 req-1403846c-83c9-403d-88c4-71c9b00301ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:07 compute-1 nova_compute[183751]: 2026-01-27 22:23:07.920 183755 DEBUG oslo_concurrency.lockutils [req-7ecd7e5d-4d3a-4727-8948-a01e38055392 req-1403846c-83c9-403d-88c4-71c9b00301ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:07 compute-1 nova_compute[183751]: 2026-01-27 22:23:07.921 183755 DEBUG oslo_concurrency.lockutils [req-7ecd7e5d-4d3a-4727-8948-a01e38055392 req-1403846c-83c9-403d-88c4-71c9b00301ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:07 compute-1 nova_compute[183751]: 2026-01-27 22:23:07.921 183755 DEBUG nova.compute.manager [req-7ecd7e5d-4d3a-4727-8948-a01e38055392 req-1403846c-83c9-403d-88c4-71c9b00301ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] No waiting events found dispatching network-vif-plugged-85b9ff01-2789-4532-963a-76dbf2aba33a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:23:07 compute-1 nova_compute[183751]: 2026-01-27 22:23:07.922 183755 WARNING nova.compute.manager [req-7ecd7e5d-4d3a-4727-8948-a01e38055392 req-1403846c-83c9-403d-88c4-71c9b00301ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Received unexpected event network-vif-plugged-85b9ff01-2789-4532-963a-76dbf2aba33a for instance with vm_state active and task_state None.
Jan 27 22:23:08 compute-1 nova_compute[183751]: 2026-01-27 22:23:08.336 183755 DEBUG oslo_concurrency.lockutils [None req-83f47f0e-9ea6-4de6-a7d6-0f6fab269e9c 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.221s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:10 compute-1 nova_compute[183751]: 2026-01-27 22:23:10.883 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:10 compute-1 nova_compute[183751]: 2026-01-27 22:23:10.900 183755 DEBUG oslo_concurrency.lockutils [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquiring lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:10 compute-1 nova_compute[183751]: 2026-01-27 22:23:10.901 183755 DEBUG oslo_concurrency.lockutils [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:10 compute-1 nova_compute[183751]: 2026-01-27 22:23:10.901 183755 DEBUG oslo_concurrency.lockutils [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquiring lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:10 compute-1 nova_compute[183751]: 2026-01-27 22:23:10.902 183755 DEBUG oslo_concurrency.lockutils [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:10 compute-1 nova_compute[183751]: 2026-01-27 22:23:10.902 183755 DEBUG oslo_concurrency.lockutils [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:10 compute-1 nova_compute[183751]: 2026-01-27 22:23:10.916 183755 INFO nova.compute.manager [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Terminating instance
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.264 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.264 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.265 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:11 compute-1 nova_compute[183751]: 2026-01-27 22:23:11.432 183755 DEBUG nova.compute.manager [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 27 22:23:11 compute-1 kernel: tap85b9ff01-27 (unregistering): left promiscuous mode
Jan 27 22:23:11 compute-1 NetworkManager[56069]: <info>  [1769552591.4554] device (tap85b9ff01-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 27 22:23:11 compute-1 ovn_controller[95969]: 2026-01-27T22:23:11Z|00054|binding|INFO|Releasing lport 85b9ff01-2789-4532-963a-76dbf2aba33a from this chassis (sb_readonly=0)
Jan 27 22:23:11 compute-1 ovn_controller[95969]: 2026-01-27T22:23:11Z|00055|binding|INFO|Setting lport 85b9ff01-2789-4532-963a-76dbf2aba33a down in Southbound
Jan 27 22:23:11 compute-1 ovn_controller[95969]: 2026-01-27T22:23:11Z|00056|binding|INFO|Removing iface tap85b9ff01-27 ovn-installed in OVS
Jan 27 22:23:11 compute-1 nova_compute[183751]: 2026-01-27 22:23:11.506 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.515 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:54:10 10.100.0.7'], port_security=['fa:16:3e:9a:54:10 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0a7c153e-00ae-4ad9-b073-e98bf81d6e23', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c00680f450794ece8c21a7d0ab0378c0', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ba3f7dd4-ddf4-40bb-9daf-0a63e367e0ef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50cc5574-cb7b-4d29-9875-3de476f6a1f0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=85b9ff01-2789-4532-963a-76dbf2aba33a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.517 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 85b9ff01-2789-4532-963a-76dbf2aba33a in datapath 36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af unbound from our chassis
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.518 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.519 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[0ffc3fc9-0446-4d26-a23d-f5efca136beb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.520 105247 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af namespace which is not needed anymore
Jan 27 22:23:11 compute-1 nova_compute[183751]: 2026-01-27 22:23:11.526 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:11 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 27 22:23:11 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 5.923s CPU time.
Jan 27 22:23:11 compute-1 systemd-machined[155034]: Machine qemu-2-instance-00000004 terminated.
Jan 27 22:23:11 compute-1 neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af[216221]: [NOTICE]   (216225) : haproxy version is 3.0.5-8e879a5
Jan 27 22:23:11 compute-1 neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af[216221]: [NOTICE]   (216225) : path to executable is /usr/sbin/haproxy
Jan 27 22:23:11 compute-1 neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af[216221]: [WARNING]  (216225) : Exiting Master process...
Jan 27 22:23:11 compute-1 podman[216259]: 2026-01-27 22:23:11.657661085 +0000 UTC m=+0.038598017 container kill f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Jan 27 22:23:11 compute-1 neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af[216221]: [ALERT]    (216225) : Current worker (216227) exited with code 143 (Terminated)
Jan 27 22:23:11 compute-1 neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af[216221]: [WARNING]  (216225) : All workers exited. Exiting... (0)
Jan 27 22:23:11 compute-1 systemd[1]: libpod-f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44.scope: Deactivated successfully.
Jan 27 22:23:11 compute-1 nova_compute[183751]: 2026-01-27 22:23:11.712 183755 INFO nova.virt.libvirt.driver [-] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Instance destroyed successfully.
Jan 27 22:23:11 compute-1 nova_compute[183751]: 2026-01-27 22:23:11.713 183755 DEBUG nova.objects.instance [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lazy-loading 'resources' on Instance uuid 0a7c153e-00ae-4ad9-b073-e98bf81d6e23 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:23:11 compute-1 podman[216283]: 2026-01-27 22:23:11.72736685 +0000 UTC m=+0.028370303 container died f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 22:23:11 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44-userdata-shm.mount: Deactivated successfully.
Jan 27 22:23:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-e9c204290b9b802f306a799929b0f5702f706d628b1bf3e0037630ecdbb855a3-merged.mount: Deactivated successfully.
Jan 27 22:23:11 compute-1 podman[216283]: 2026-01-27 22:23:11.772339063 +0000 UTC m=+0.073342506 container remove f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Jan 27 22:23:11 compute-1 systemd[1]: libpod-conmon-f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44.scope: Deactivated successfully.
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.781 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[f53ae781-4aa6-4ecc-9e5b-036eac72d844]: (4, ("Tue Jan 27 10:23:11 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af (f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44)\nf871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44\nTue Jan 27 10:23:11 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af (f871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44)\nf871c29045953f4b65600f06cfd1525a8465e02d60ea93a405163a234ca46f44\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.783 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[00f80881-8b7e-45f5-9850-04ebdaf0d464]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.784 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.784 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[86664850-b95d-4829-b621-073db10a7fd8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.785 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36f2a4aa-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:11 compute-1 nova_compute[183751]: 2026-01-27 22:23:11.787 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:11 compute-1 kernel: tap36f2a4aa-60: left promiscuous mode
Jan 27 22:23:11 compute-1 nova_compute[183751]: 2026-01-27 22:23:11.805 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:11 compute-1 nova_compute[183751]: 2026-01-27 22:23:11.806 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.807 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[9ffba954-02c6-4828-91cb-65ff8b83603c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.823 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[8aadad81-6606-486c-8e06-d262810fcbce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.824 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[0fef1234-3d0c-4e6c-b2bd-7d4fe712df7c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.850 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[128ed883-e8e7-4174-9f2b-c077882f9300]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 872788, 'reachable_time': 29101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216320, 'error': None, 'target': 'ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.853 105687 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 27 22:23:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:11.853 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[2f695852-279c-4ce6-8dec-764373b8ec6b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:11 compute-1 systemd[1]: run-netns-ovnmeta\x2d36f2a4aa\x2d6aaf\x2d4a3f\x2d9507\x2dcfbf1eb8a4af.mount: Deactivated successfully.
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.221 183755 DEBUG nova.virt.libvirt.vif [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-27T22:22:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-573176118',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-573176118',id=4,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-27T22:23:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c00680f450794ece8c21a7d0ab0378c0',ramdisk_id='',reservation_id='r-mp000bdq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,admin,member,reader',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-483102978',owner_user_name='tempest-TestExecuteBasicStrategy-483102978-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-27T22:23:07Z,user_data=None,user_id='9483f5753319427f8ad7c898bb549210',uuid=0a7c153e-00ae-4ad9-b073-e98bf81d6e23,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "85b9ff01-2789-4532-963a-76dbf2aba33a", "address": "fa:16:3e:9a:54:10", "network": {"id": "36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1503795607-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "969a674fc959431aa6fa4699c13a5d15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85b9ff01-27", "ovs_interfaceid": "85b9ff01-2789-4532-963a-76dbf2aba33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.222 183755 DEBUG nova.network.os_vif_util [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Converting VIF {"id": "85b9ff01-2789-4532-963a-76dbf2aba33a", "address": "fa:16:3e:9a:54:10", "network": {"id": "36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1503795607-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "969a674fc959431aa6fa4699c13a5d15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85b9ff01-27", "ovs_interfaceid": "85b9ff01-2789-4532-963a-76dbf2aba33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.223 183755 DEBUG nova.network.os_vif_util [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:54:10,bridge_name='br-int',has_traffic_filtering=True,id=85b9ff01-2789-4532-963a-76dbf2aba33a,network=Network(36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85b9ff01-27') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.224 183755 DEBUG os_vif [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:54:10,bridge_name='br-int',has_traffic_filtering=True,id=85b9ff01-2789-4532-963a-76dbf2aba33a,network=Network(36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85b9ff01-27') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.227 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.228 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85b9ff01-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.230 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.232 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.233 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.233 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c9fa30ba-dbf0-4bfe-a64a-09c85b1cc49b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.234 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.236 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.238 183755 INFO os_vif [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:54:10,bridge_name='br-int',has_traffic_filtering=True,id=85b9ff01-2789-4532-963a-76dbf2aba33a,network=Network(36f2a4aa-6aaf-4a3f-9507-cfbf1eb8a4af),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85b9ff01-27')
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.239 183755 INFO nova.virt.libvirt.driver [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Deleting instance files /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23_del
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.241 183755 INFO nova.virt.libvirt.driver [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Deletion of /var/lib/nova/instances/0a7c153e-00ae-4ad9-b073-e98bf81d6e23_del complete
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.417 183755 DEBUG nova.compute.manager [req-edaac8d6-0cac-4f64-a253-ac9d0e87a948 req-42d1ed17-32f9-423a-a025-f0b35a9fc5ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Received event network-vif-unplugged-85b9ff01-2789-4532-963a-76dbf2aba33a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.418 183755 DEBUG oslo_concurrency.lockutils [req-edaac8d6-0cac-4f64-a253-ac9d0e87a948 req-42d1ed17-32f9-423a-a025-f0b35a9fc5ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.419 183755 DEBUG oslo_concurrency.lockutils [req-edaac8d6-0cac-4f64-a253-ac9d0e87a948 req-42d1ed17-32f9-423a-a025-f0b35a9fc5ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.419 183755 DEBUG oslo_concurrency.lockutils [req-edaac8d6-0cac-4f64-a253-ac9d0e87a948 req-42d1ed17-32f9-423a-a025-f0b35a9fc5ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.419 183755 DEBUG nova.compute.manager [req-edaac8d6-0cac-4f64-a253-ac9d0e87a948 req-42d1ed17-32f9-423a-a025-f0b35a9fc5ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] No waiting events found dispatching network-vif-unplugged-85b9ff01-2789-4532-963a-76dbf2aba33a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.420 183755 DEBUG nova.compute.manager [req-edaac8d6-0cac-4f64-a253-ac9d0e87a948 req-42d1ed17-32f9-423a-a025-f0b35a9fc5ec 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Received event network-vif-unplugged-85b9ff01-2789-4532-963a-76dbf2aba33a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.756 183755 INFO nova.compute.manager [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Took 1.32 seconds to destroy the instance on the hypervisor.
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.757 183755 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.757 183755 DEBUG nova.compute.manager [-] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.758 183755 DEBUG nova.network.neutron [-] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 27 22:23:12 compute-1 nova_compute[183751]: 2026-01-27 22:23:12.758 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:23:13 compute-1 podman[216321]: 2026-01-27 22:23:13.281463047 +0000 UTC m=+0.078008712 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:23:13 compute-1 nova_compute[183751]: 2026-01-27 22:23:13.424 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:23:13 compute-1 nova_compute[183751]: 2026-01-27 22:23:13.824 183755 DEBUG nova.compute.manager [req-787f52d2-5aec-40be-b63b-47ef992e8d47 req-7c8cbe06-92f1-4635-9638-c2c88634864a 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Received event network-vif-deleted-85b9ff01-2789-4532-963a-76dbf2aba33a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:23:13 compute-1 nova_compute[183751]: 2026-01-27 22:23:13.824 183755 INFO nova.compute.manager [req-787f52d2-5aec-40be-b63b-47ef992e8d47 req-7c8cbe06-92f1-4635-9638-c2c88634864a 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Neutron deleted interface 85b9ff01-2789-4532-963a-76dbf2aba33a; detaching it from the instance and deleting it from the info cache
Jan 27 22:23:13 compute-1 nova_compute[183751]: 2026-01-27 22:23:13.824 183755 DEBUG nova.network.neutron [req-787f52d2-5aec-40be-b63b-47ef992e8d47 req-7c8cbe06-92f1-4635-9638-c2c88634864a 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:23:14 compute-1 nova_compute[183751]: 2026-01-27 22:23:14.259 183755 DEBUG nova.network.neutron [-] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:23:14 compute-1 nova_compute[183751]: 2026-01-27 22:23:14.334 183755 DEBUG nova.compute.manager [req-787f52d2-5aec-40be-b63b-47ef992e8d47 req-7c8cbe06-92f1-4635-9638-c2c88634864a 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Detach interface failed, port_id=85b9ff01-2789-4532-963a-76dbf2aba33a, reason: Instance 0a7c153e-00ae-4ad9-b073-e98bf81d6e23 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 27 22:23:14 compute-1 nova_compute[183751]: 2026-01-27 22:23:14.541 183755 DEBUG nova.compute.manager [req-d79ab356-1e14-4e18-8efd-99f33f97ca02 req-11d8b085-636d-4947-99c3-2076c01420bc 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Received event network-vif-unplugged-85b9ff01-2789-4532-963a-76dbf2aba33a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:23:14 compute-1 nova_compute[183751]: 2026-01-27 22:23:14.541 183755 DEBUG oslo_concurrency.lockutils [req-d79ab356-1e14-4e18-8efd-99f33f97ca02 req-11d8b085-636d-4947-99c3-2076c01420bc 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:14 compute-1 nova_compute[183751]: 2026-01-27 22:23:14.542 183755 DEBUG oslo_concurrency.lockutils [req-d79ab356-1e14-4e18-8efd-99f33f97ca02 req-11d8b085-636d-4947-99c3-2076c01420bc 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:14 compute-1 nova_compute[183751]: 2026-01-27 22:23:14.542 183755 DEBUG oslo_concurrency.lockutils [req-d79ab356-1e14-4e18-8efd-99f33f97ca02 req-11d8b085-636d-4947-99c3-2076c01420bc 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:14 compute-1 nova_compute[183751]: 2026-01-27 22:23:14.543 183755 DEBUG nova.compute.manager [req-d79ab356-1e14-4e18-8efd-99f33f97ca02 req-11d8b085-636d-4947-99c3-2076c01420bc 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] No waiting events found dispatching network-vif-unplugged-85b9ff01-2789-4532-963a-76dbf2aba33a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:23:14 compute-1 nova_compute[183751]: 2026-01-27 22:23:14.543 183755 DEBUG nova.compute.manager [req-d79ab356-1e14-4e18-8efd-99f33f97ca02 req-11d8b085-636d-4947-99c3-2076c01420bc 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Received event network-vif-unplugged-85b9ff01-2789-4532-963a-76dbf2aba33a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:23:14 compute-1 nova_compute[183751]: 2026-01-27 22:23:14.768 183755 INFO nova.compute.manager [-] [instance: 0a7c153e-00ae-4ad9-b073-e98bf81d6e23] Took 2.01 seconds to deallocate network for instance.
Jan 27 22:23:15 compute-1 nova_compute[183751]: 2026-01-27 22:23:15.296 183755 DEBUG oslo_concurrency.lockutils [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:15 compute-1 nova_compute[183751]: 2026-01-27 22:23:15.297 183755 DEBUG oslo_concurrency.lockutils [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:15 compute-1 nova_compute[183751]: 2026-01-27 22:23:15.358 183755 DEBUG nova.compute.provider_tree [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:23:15 compute-1 nova_compute[183751]: 2026-01-27 22:23:15.866 183755 DEBUG nova.scheduler.client.report [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:23:15 compute-1 nova_compute[183751]: 2026-01-27 22:23:15.884 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:16 compute-1 nova_compute[183751]: 2026-01-27 22:23:16.376 183755 DEBUG oslo_concurrency.lockutils [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.079s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:16 compute-1 nova_compute[183751]: 2026-01-27 22:23:16.418 183755 INFO nova.scheduler.client.report [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Deleted allocations for instance 0a7c153e-00ae-4ad9-b073-e98bf81d6e23
Jan 27 22:23:17 compute-1 nova_compute[183751]: 2026-01-27 22:23:17.235 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:17 compute-1 nova_compute[183751]: 2026-01-27 22:23:17.584 183755 DEBUG oslo_concurrency.lockutils [None req-7ad73595-8add-44a4-9498-e16b055e8d35 9483f5753319427f8ad7c898bb549210 c00680f450794ece8c21a7d0ab0378c0 - - default default] Lock "0a7c153e-00ae-4ad9-b073-e98bf81d6e23" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.683s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:19 compute-1 openstack_network_exporter[195945]: ERROR   22:23:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:23:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:23:19 compute-1 openstack_network_exporter[195945]: ERROR   22:23:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:23:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:23:20 compute-1 nova_compute[183751]: 2026-01-27 22:23:20.887 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:22 compute-1 nova_compute[183751]: 2026-01-27 22:23:22.238 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:25 compute-1 nova_compute[183751]: 2026-01-27 22:23:25.889 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:27 compute-1 nova_compute[183751]: 2026-01-27 22:23:27.240 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:29 compute-1 podman[216348]: 2026-01-27 22:23:29.817984989 +0000 UTC m=+0.128278065 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 22:23:30 compute-1 nova_compute[183751]: 2026-01-27 22:23:30.890 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:32 compute-1 nova_compute[183751]: 2026-01-27 22:23:32.244 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:34 compute-1 nova_compute[183751]: 2026-01-27 22:23:34.203 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:34 compute-1 podman[216375]: 2026-01-27 22:23:34.790278407 +0000 UTC m=+0.085999230 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 27 22:23:34 compute-1 podman[216374]: 2026-01-27 22:23:34.823196311 +0000 UTC m=+0.123480847 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9)
Jan 27 22:23:35 compute-1 podman[193064]: time="2026-01-27T22:23:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:23:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:23:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:23:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:23:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 27 22:23:35 compute-1 nova_compute[183751]: 2026-01-27 22:23:35.893 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:37 compute-1 nova_compute[183751]: 2026-01-27 22:23:37.247 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:40 compute-1 nova_compute[183751]: 2026-01-27 22:23:40.896 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:42 compute-1 nova_compute[183751]: 2026-01-27 22:23:42.249 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:43.630 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:23:43 compute-1 nova_compute[183751]: 2026-01-27 22:23:43.631 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:43.631 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:23:43 compute-1 podman[216414]: 2026-01-27 22:23:43.772951859 +0000 UTC m=+0.074539546 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:23:45 compute-1 nova_compute[183751]: 2026-01-27 22:23:45.898 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:47 compute-1 nova_compute[183751]: 2026-01-27 22:23:47.251 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:47 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:47.763 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:9d:07 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1970d2da-1fc4-4a0d-843a-a7525e8264bd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1970d2da-1fc4-4a0d-843a-a7525e8264bd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '425fcc32bc4444dcbf9062daaa860ba9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58d9b6db-e921-419e-add6-2eda1d68036b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2d6db684-caa6-42d5-a7bb-bf5f29ad1420) old=Port_Binding(mac=['fa:16:3e:78:9d:07'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1970d2da-1fc4-4a0d-843a-a7525e8264bd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1970d2da-1fc4-4a0d-843a-a7525e8264bd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '425fcc32bc4444dcbf9062daaa860ba9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:23:47 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:47.765 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2d6db684-caa6-42d5-a7bb-bf5f29ad1420 in datapath 1970d2da-1fc4-4a0d-843a-a7525e8264bd updated
Jan 27 22:23:47 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:47.766 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1970d2da-1fc4-4a0d-843a-a7525e8264bd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:23:47 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:47.767 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[81808086-0f88-4c3f-84db-931b19f4df50]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:49 compute-1 openstack_network_exporter[195945]: ERROR   22:23:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:23:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:23:49 compute-1 openstack_network_exporter[195945]: ERROR   22:23:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:23:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:23:50 compute-1 nova_compute[183751]: 2026-01-27 22:23:50.901 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:51 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:51.633 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:23:52 compute-1 nova_compute[183751]: 2026-01-27 22:23:52.253 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:53 compute-1 nova_compute[183751]: 2026-01-27 22:23:53.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:23:54 compute-1 nova_compute[183751]: 2026-01-27 22:23:54.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:23:55 compute-1 nova_compute[183751]: 2026-01-27 22:23:55.903 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:56 compute-1 nova_compute[183751]: 2026-01-27 22:23:56.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.255 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:23:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:57.349 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:b4:27 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1c6970fc-9975-4ceb-816f-5ff72bccd0e3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c6970fc-9975-4ceb-816f-5ff72bccd0e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '605992911e554da59a4c01e1d1a47882', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93edeba3-63bc-46c6-8738-1095d65f2d57, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=86acf75e-23c0-4967-be47-12ae34f9382b) old=Port_Binding(mac=['fa:16:3e:04:b4:27'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1c6970fc-9975-4ceb-816f-5ff72bccd0e3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1c6970fc-9975-4ceb-816f-5ff72bccd0e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '605992911e554da59a4c01e1d1a47882', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:23:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:57.350 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 86acf75e-23c0-4967-be47-12ae34f9382b in datapath 1c6970fc-9975-4ceb-816f-5ff72bccd0e3 updated
Jan 27 22:23:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:57.351 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1c6970fc-9975-4ceb-816f-5ff72bccd0e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:23:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:23:57.352 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[883384b0-96e6-457e-a24a-13322ee4dda9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.666 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.887 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.889 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.931 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.932 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5861MB free_disk=73.14250183105469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.932 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:23:57 compute-1 nova_compute[183751]: 2026-01-27 22:23:57.932 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:23:58 compute-1 nova_compute[183751]: 2026-01-27 22:23:58.991 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:23:58 compute-1 nova_compute[183751]: 2026-01-27 22:23:58.992 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:23:57 up  2:26,  0 user,  load average: 0.11, 0.17, 0.09\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:23:59 compute-1 nova_compute[183751]: 2026-01-27 22:23:59.229 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:23:59 compute-1 nova_compute[183751]: 2026-01-27 22:23:59.736 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:24:00 compute-1 nova_compute[183751]: 2026-01-27 22:24:00.247 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:24:00 compute-1 nova_compute[183751]: 2026-01-27 22:24:00.247 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.315s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:24:00 compute-1 podman[216440]: 2026-01-27 22:24:00.861807153 +0000 UTC m=+0.159103319 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 22:24:00 compute-1 nova_compute[183751]: 2026-01-27 22:24:00.904 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:01 compute-1 nova_compute[183751]: 2026-01-27 22:24:01.248 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:24:01 compute-1 nova_compute[183751]: 2026-01-27 22:24:01.249 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:24:01 compute-1 nova_compute[183751]: 2026-01-27 22:24:01.249 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:24:01 compute-1 nova_compute[183751]: 2026-01-27 22:24:01.249 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:24:02 compute-1 nova_compute[183751]: 2026-01-27 22:24:02.258 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:05 compute-1 podman[193064]: time="2026-01-27T22:24:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:24:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:24:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:24:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:24:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Jan 27 22:24:05 compute-1 podman[216466]: 2026-01-27 22:24:05.781415856 +0000 UTC m=+0.075955411 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 22:24:05 compute-1 podman[216467]: 2026-01-27 22:24:05.781398836 +0000 UTC m=+0.069114412 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 22:24:05 compute-1 nova_compute[183751]: 2026-01-27 22:24:05.908 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:07 compute-1 nova_compute[183751]: 2026-01-27 22:24:07.260 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:07 compute-1 sshd-session[216509]: Invalid user lighthouse from 80.94.92.186 port 57332
Jan 27 22:24:07 compute-1 ovn_controller[95969]: 2026-01-27T22:24:07Z|00057|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 27 22:24:08 compute-1 sshd-session[216509]: Connection closed by invalid user lighthouse 80.94.92.186 port 57332 [preauth]
Jan 27 22:24:10 compute-1 nova_compute[183751]: 2026-01-27 22:24:10.909 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:11.266 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:24:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:11.267 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:24:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:11.267 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:24:12 compute-1 nova_compute[183751]: 2026-01-27 22:24:12.262 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:14 compute-1 podman[216512]: 2026-01-27 22:24:14.770800729 +0000 UTC m=+0.078125765 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:24:15 compute-1 nova_compute[183751]: 2026-01-27 22:24:15.911 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:17 compute-1 nova_compute[183751]: 2026-01-27 22:24:17.264 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:19 compute-1 openstack_network_exporter[195945]: ERROR   22:24:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:24:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:24:19 compute-1 openstack_network_exporter[195945]: ERROR   22:24:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:24:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:24:20 compute-1 nova_compute[183751]: 2026-01-27 22:24:20.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:24:20 compute-1 nova_compute[183751]: 2026-01-27 22:24:20.913 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:22 compute-1 nova_compute[183751]: 2026-01-27 22:24:22.266 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:24.508 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:a2:69 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da4007d0-b29b-4778-a286-b5dd1155cf44', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a965ed27f6ee4c02a2538e87cb3ecdeb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6f757ff-7e1b-47d8-a25a-9e1dab5d0324, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b2c930e7-5341-4fb1-8625-e406ed44ab80) old=Port_Binding(mac=['fa:16:3e:2e:a2:69'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da4007d0-b29b-4778-a286-b5dd1155cf44', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a965ed27f6ee4c02a2538e87cb3ecdeb', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:24:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:24.510 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b2c930e7-5341-4fb1-8625-e406ed44ab80 in datapath da4007d0-b29b-4778-a286-b5dd1155cf44 updated
Jan 27 22:24:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:24.511 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network da4007d0-b29b-4778-a286-b5dd1155cf44, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:24:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:24.512 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[314827b0-5355-436e-9617-01f9b3e28e22]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:24:25 compute-1 nova_compute[183751]: 2026-01-27 22:24:25.915 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:27 compute-1 nova_compute[183751]: 2026-01-27 22:24:27.269 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:30 compute-1 nova_compute[183751]: 2026-01-27 22:24:30.916 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:31 compute-1 podman[216536]: 2026-01-27 22:24:31.837149154 +0000 UTC m=+0.136317155 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 22:24:32 compute-1 nova_compute[183751]: 2026-01-27 22:24:32.271 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:35 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:35.006 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:eb:34 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3a76e8b1-5851-4f66-963b-4a9cba8a05bc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a76e8b1-5851-4f66-963b-4a9cba8a05bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a4dc4388a0f4e9eb9abef43e0bc8df1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=682e4900-188c-463c-af44-4f9068fd88e5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6174e6b8-890f-4989-80f4-f35e2774de79) old=Port_Binding(mac=['fa:16:3e:be:eb:34'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-3a76e8b1-5851-4f66-963b-4a9cba8a05bc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a76e8b1-5851-4f66-963b-4a9cba8a05bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a4dc4388a0f4e9eb9abef43e0bc8df1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:24:35 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:35.007 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6174e6b8-890f-4989-80f4-f35e2774de79 in datapath 3a76e8b1-5851-4f66-963b-4a9cba8a05bc updated
Jan 27 22:24:35 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:35.008 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a76e8b1-5851-4f66-963b-4a9cba8a05bc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:24:35 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:35.008 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[a69a2c6a-0638-4e19-afdb-be0890bafe5d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:24:35 compute-1 podman[193064]: time="2026-01-27T22:24:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:24:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:24:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:24:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:24:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Jan 27 22:24:35 compute-1 nova_compute[183751]: 2026-01-27 22:24:35.919 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:36 compute-1 podman[216565]: 2026-01-27 22:24:36.777950792 +0000 UTC m=+0.073458149 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 27 22:24:36 compute-1 podman[216564]: 2026-01-27 22:24:36.796345108 +0000 UTC m=+0.096815308 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Jan 27 22:24:37 compute-1 nova_compute[183751]: 2026-01-27 22:24:37.273 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:40 compute-1 nova_compute[183751]: 2026-01-27 22:24:40.921 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:42 compute-1 nova_compute[183751]: 2026-01-27 22:24:42.276 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:45 compute-1 podman[216604]: 2026-01-27 22:24:45.762384871 +0000 UTC m=+0.066915718 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:24:45 compute-1 nova_compute[183751]: 2026-01-27 22:24:45.925 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:47 compute-1 nova_compute[183751]: 2026-01-27 22:24:47.278 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:49 compute-1 openstack_network_exporter[195945]: ERROR   22:24:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:24:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:24:49 compute-1 openstack_network_exporter[195945]: ERROR   22:24:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:24:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:24:50 compute-1 nova_compute[183751]: 2026-01-27 22:24:50.926 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:52 compute-1 nova_compute[183751]: 2026-01-27 22:24:52.280 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:54 compute-1 nova_compute[183751]: 2026-01-27 22:24:54.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:24:55 compute-1 nova_compute[183751]: 2026-01-27 22:24:55.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:24:55 compute-1 nova_compute[183751]: 2026-01-27 22:24:55.929 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.282 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.659 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.660 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.660 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.660 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.891 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.893 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.925 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.927 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5874MB free_disk=73.14249801635742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.927 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:24:57 compute-1 nova_compute[183751]: 2026-01-27 22:24:57.928 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:24:58 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:58.666 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:24:58 compute-1 nova_compute[183751]: 2026-01-27 22:24:58.667 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:24:58 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:24:58.668 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:24:59 compute-1 nova_compute[183751]: 2026-01-27 22:24:59.046 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:24:59 compute-1 nova_compute[183751]: 2026-01-27 22:24:59.047 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:24:57 up  2:27,  0 user,  load average: 0.04, 0.14, 0.08\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:24:59 compute-1 nova_compute[183751]: 2026-01-27 22:24:59.078 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:24:59 compute-1 nova_compute[183751]: 2026-01-27 22:24:59.587 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:25:00 compute-1 nova_compute[183751]: 2026-01-27 22:25:00.099 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:25:00 compute-1 nova_compute[183751]: 2026-01-27 22:25:00.100 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.171s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:00 compute-1 nova_compute[183751]: 2026-01-27 22:25:00.930 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:01 compute-1 nova_compute[183751]: 2026-01-27 22:25:01.101 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:25:01 compute-1 nova_compute[183751]: 2026-01-27 22:25:01.102 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:25:01 compute-1 nova_compute[183751]: 2026-01-27 22:25:01.102 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:25:02 compute-1 nova_compute[183751]: 2026-01-27 22:25:02.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:25:02 compute-1 nova_compute[183751]: 2026-01-27 22:25:02.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:25:02 compute-1 nova_compute[183751]: 2026-01-27 22:25:02.284 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:02 compute-1 podman[216630]: 2026-01-27 22:25:02.881510334 +0000 UTC m=+0.172943641 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4)
Jan 27 22:25:05 compute-1 podman[193064]: time="2026-01-27T22:25:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:25:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:25:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:25:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:25:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Jan 27 22:25:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:05.670 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:05 compute-1 nova_compute[183751]: 2026-01-27 22:25:05.841 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "0a9c96e3-896f-4d36-9676-a5a374c3482b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:05 compute-1 nova_compute[183751]: 2026-01-27 22:25:05.842 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:05 compute-1 nova_compute[183751]: 2026-01-27 22:25:05.933 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:06 compute-1 nova_compute[183751]: 2026-01-27 22:25:06.351 183755 DEBUG nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 27 22:25:06 compute-1 nova_compute[183751]: 2026-01-27 22:25:06.925 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:06 compute-1 nova_compute[183751]: 2026-01-27 22:25:06.925 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:06 compute-1 nova_compute[183751]: 2026-01-27 22:25:06.931 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 27 22:25:06 compute-1 nova_compute[183751]: 2026-01-27 22:25:06.932 183755 INFO nova.compute.claims [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Claim successful on node compute-1.ctlplane.example.com
Jan 27 22:25:07 compute-1 nova_compute[183751]: 2026-01-27 22:25:07.287 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:07 compute-1 podman[216657]: 2026-01-27 22:25:07.791771536 +0000 UTC m=+0.096189352 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest)
Jan 27 22:25:07 compute-1 podman[216656]: 2026-01-27 22:25:07.796653546 +0000 UTC m=+0.099425122 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 22:25:08 compute-1 nova_compute[183751]: 2026-01-27 22:25:08.025 183755 DEBUG nova.compute.provider_tree [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:25:08 compute-1 nova_compute[183751]: 2026-01-27 22:25:08.534 183755 DEBUG nova.scheduler.client.report [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:25:09 compute-1 nova_compute[183751]: 2026-01-27 22:25:09.047 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:09 compute-1 nova_compute[183751]: 2026-01-27 22:25:09.048 183755 DEBUG nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 27 22:25:09 compute-1 nova_compute[183751]: 2026-01-27 22:25:09.564 183755 DEBUG nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 27 22:25:09 compute-1 nova_compute[183751]: 2026-01-27 22:25:09.565 183755 DEBUG nova.network.neutron [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 27 22:25:09 compute-1 nova_compute[183751]: 2026-01-27 22:25:09.567 183755 WARNING neutronclient.v2_0.client [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:25:09 compute-1 nova_compute[183751]: 2026-01-27 22:25:09.568 183755 WARNING neutronclient.v2_0.client [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:25:10 compute-1 nova_compute[183751]: 2026-01-27 22:25:10.084 183755 INFO nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 27 22:25:10 compute-1 nova_compute[183751]: 2026-01-27 22:25:10.595 183755 DEBUG nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 27 22:25:10 compute-1 nova_compute[183751]: 2026-01-27 22:25:10.934 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:11.268 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:11.268 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:11.269 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.506 183755 DEBUG nova.network.neutron [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Successfully created port: a5d15d73-4376-43e7-b38b-d19871d5a694 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.619 183755 DEBUG nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.623 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.623 183755 INFO nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Creating image(s)
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.624 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "/var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.625 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "/var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.626 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "/var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.628 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.635 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.639 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.729 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.730 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.730 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.732 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.735 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.736 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.800 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.801 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.847 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.849 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.850 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.935 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.936 183755 DEBUG nova.virt.disk.api [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Checking if we can resize image /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 27 22:25:11 compute-1 nova_compute[183751]: 2026-01-27 22:25:11.937 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:25:12 compute-1 nova_compute[183751]: 2026-01-27 22:25:12.023 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:25:12 compute-1 nova_compute[183751]: 2026-01-27 22:25:12.024 183755 DEBUG nova.virt.disk.api [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Cannot resize image /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 27 22:25:12 compute-1 nova_compute[183751]: 2026-01-27 22:25:12.025 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 27 22:25:12 compute-1 nova_compute[183751]: 2026-01-27 22:25:12.025 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Ensure instance console log exists: /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 27 22:25:12 compute-1 nova_compute[183751]: 2026-01-27 22:25:12.026 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:12 compute-1 nova_compute[183751]: 2026-01-27 22:25:12.026 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:12 compute-1 nova_compute[183751]: 2026-01-27 22:25:12.026 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:12 compute-1 nova_compute[183751]: 2026-01-27 22:25:12.290 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:13 compute-1 nova_compute[183751]: 2026-01-27 22:25:13.463 183755 DEBUG nova.network.neutron [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Successfully updated port: a5d15d73-4376-43e7-b38b-d19871d5a694 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 27 22:25:13 compute-1 nova_compute[183751]: 2026-01-27 22:25:13.532 183755 DEBUG nova.compute.manager [req-6ab47336-499c-44f6-9e65-17a538bbf31f req-df01fddf-5a3a-4dc6-bd78-0731fd899492 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Received event network-changed-a5d15d73-4376-43e7-b38b-d19871d5a694 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:25:13 compute-1 nova_compute[183751]: 2026-01-27 22:25:13.532 183755 DEBUG nova.compute.manager [req-6ab47336-499c-44f6-9e65-17a538bbf31f req-df01fddf-5a3a-4dc6-bd78-0731fd899492 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Refreshing instance network info cache due to event network-changed-a5d15d73-4376-43e7-b38b-d19871d5a694. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 27 22:25:13 compute-1 nova_compute[183751]: 2026-01-27 22:25:13.533 183755 DEBUG oslo_concurrency.lockutils [req-6ab47336-499c-44f6-9e65-17a538bbf31f req-df01fddf-5a3a-4dc6-bd78-0731fd899492 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "refresh_cache-0a9c96e3-896f-4d36-9676-a5a374c3482b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:25:13 compute-1 nova_compute[183751]: 2026-01-27 22:25:13.533 183755 DEBUG oslo_concurrency.lockutils [req-6ab47336-499c-44f6-9e65-17a538bbf31f req-df01fddf-5a3a-4dc6-bd78-0731fd899492 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquired lock "refresh_cache-0a9c96e3-896f-4d36-9676-a5a374c3482b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:25:13 compute-1 nova_compute[183751]: 2026-01-27 22:25:13.534 183755 DEBUG nova.network.neutron [req-6ab47336-499c-44f6-9e65-17a538bbf31f req-df01fddf-5a3a-4dc6-bd78-0731fd899492 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Refreshing network info cache for port a5d15d73-4376-43e7-b38b-d19871d5a694 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 27 22:25:13 compute-1 nova_compute[183751]: 2026-01-27 22:25:13.972 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "refresh_cache-0a9c96e3-896f-4d36-9676-a5a374c3482b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:25:14 compute-1 nova_compute[183751]: 2026-01-27 22:25:14.040 183755 WARNING neutronclient.v2_0.client [req-6ab47336-499c-44f6-9e65-17a538bbf31f req-df01fddf-5a3a-4dc6-bd78-0731fd899492 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:25:14 compute-1 nova_compute[183751]: 2026-01-27 22:25:14.455 183755 DEBUG nova.network.neutron [req-6ab47336-499c-44f6-9e65-17a538bbf31f req-df01fddf-5a3a-4dc6-bd78-0731fd899492 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:25:14 compute-1 nova_compute[183751]: 2026-01-27 22:25:14.668 183755 DEBUG nova.network.neutron [req-6ab47336-499c-44f6-9e65-17a538bbf31f req-df01fddf-5a3a-4dc6-bd78-0731fd899492 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:25:15 compute-1 nova_compute[183751]: 2026-01-27 22:25:15.176 183755 DEBUG oslo_concurrency.lockutils [req-6ab47336-499c-44f6-9e65-17a538bbf31f req-df01fddf-5a3a-4dc6-bd78-0731fd899492 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Releasing lock "refresh_cache-0a9c96e3-896f-4d36-9676-a5a374c3482b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:25:15 compute-1 nova_compute[183751]: 2026-01-27 22:25:15.177 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquired lock "refresh_cache-0a9c96e3-896f-4d36-9676-a5a374c3482b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:25:15 compute-1 nova_compute[183751]: 2026-01-27 22:25:15.178 183755 DEBUG nova.network.neutron [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 27 22:25:15 compute-1 nova_compute[183751]: 2026-01-27 22:25:15.799 183755 DEBUG nova.network.neutron [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:25:15 compute-1 nova_compute[183751]: 2026-01-27 22:25:15.936 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.056 183755 WARNING neutronclient.v2_0.client [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.445 183755 DEBUG nova.network.neutron [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Updating instance_info_cache with network_info: [{"id": "a5d15d73-4376-43e7-b38b-d19871d5a694", "address": "fa:16:3e:c6:c7:2f", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5d15d73-43", "ovs_interfaceid": "a5d15d73-4376-43e7-b38b-d19871d5a694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:25:16 compute-1 podman[216712]: 2026-01-27 22:25:16.771606211 +0000 UTC m=+0.068779574 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.956 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Releasing lock "refresh_cache-0a9c96e3-896f-4d36-9676-a5a374c3482b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.957 183755 DEBUG nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Instance network_info: |[{"id": "a5d15d73-4376-43e7-b38b-d19871d5a694", "address": "fa:16:3e:c6:c7:2f", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5d15d73-43", "ovs_interfaceid": "a5d15d73-4376-43e7-b38b-d19871d5a694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.961 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Start _get_guest_xml network_info=[{"id": "a5d15d73-4376-43e7-b38b-d19871d5a694", "address": "fa:16:3e:c6:c7:2f", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5d15d73-43", "ovs_interfaceid": "a5d15d73-4376-43e7-b38b-d19871d5a694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '46eb297a-0b7d-41f9-8336-a7ae35b5797e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.967 183755 WARNING nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.969 183755 DEBUG nova.virt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1022785074', uuid='0a9c96e3-896f-4d36-9676-a5a374c3482b'), owner=OwnerMeta(userid='8763102ab7304e1d9f53b063264a3607', username='tempest-TestExecuteHostMaintenanceStrategy-1384268311-project-admin', projectid='6a4dc4388a0f4e9eb9abef43e0bc8df1', projectname='tempest-TestExecuteHostMaintenanceStrategy-1384268311'), image=ImageMeta(id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "a5d15d73-4376-43e7-b38b-d19871d5a694", "address": "fa:16:3e:c6:c7:2f", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5d15d73-43", "ovs_interfaceid": 
"a5d15d73-4376-43e7-b38b-d19871d5a694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769552716.9697502) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.974 183755 DEBUG nova.virt.libvirt.host [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.975 183755 DEBUG nova.virt.libvirt.host [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.979 183755 DEBUG nova.virt.libvirt.host [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.980 183755 DEBUG nova.virt.libvirt.host [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.982 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.982 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-27T22:07:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.983 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.983 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.984 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.984 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.984 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.985 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.985 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.986 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.986 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.986 183755 DEBUG nova.virt.hardware [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.993 183755 DEBUG nova.virt.libvirt.vif [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:25:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1022785074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1022785074',id=6,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a4dc4388a0f4e9eb9abef43e0bc8df1',ramdisk_id='',reservation_id='r-yc6iod8n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1384268311',owner_user_name='tempest-Te
stExecuteHostMaintenanceStrategy-1384268311-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:25:10Z,user_data=None,user_id='8763102ab7304e1d9f53b063264a3607',uuid=0a9c96e3-896f-4d36-9676-a5a374c3482b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5d15d73-4376-43e7-b38b-d19871d5a694", "address": "fa:16:3e:c6:c7:2f", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5d15d73-43", "ovs_interfaceid": "a5d15d73-4376-43e7-b38b-d19871d5a694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.994 183755 DEBUG nova.network.os_vif_util [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converting VIF {"id": "a5d15d73-4376-43e7-b38b-d19871d5a694", "address": "fa:16:3e:c6:c7:2f", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5d15d73-43", "ovs_interfaceid": "a5d15d73-4376-43e7-b38b-d19871d5a694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.995 183755 DEBUG nova.network.os_vif_util [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:c7:2f,bridge_name='br-int',has_traffic_filtering=True,id=a5d15d73-4376-43e7-b38b-d19871d5a694,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5d15d73-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:25:16 compute-1 nova_compute[183751]: 2026-01-27 22:25:16.997 183755 DEBUG nova.objects.instance [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0a9c96e3-896f-4d36-9676-a5a374c3482b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.293 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.506 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] End _get_guest_xml xml=<domain type="kvm">
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <uuid>0a9c96e3-896f-4d36-9676-a5a374c3482b</uuid>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <name>instance-00000006</name>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <memory>131072</memory>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <vcpu>1</vcpu>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <metadata>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1022785074</nova:name>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <nova:creationTime>2026-01-27 22:25:16</nova:creationTime>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <nova:flavor name="m1.nano" id="4ed8ebce-d846-4d13-950a-4fb8c3a02942">
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:memory>128</nova:memory>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:disk>1</nova:disk>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:swap>0</nova:swap>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:ephemeral>0</nova:ephemeral>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:vcpus>1</nova:vcpus>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:extraSpecs>
Jan 27 22:25:17 compute-1 nova_compute[183751]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         </nova:extraSpecs>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       </nova:flavor>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <nova:image uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e">
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:minDisk>1</nova:minDisk>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:minRam>0</nova:minRam>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:properties>
Jan 27 22:25:17 compute-1 nova_compute[183751]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         </nova:properties>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       </nova:image>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <nova:owner>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:user uuid="8763102ab7304e1d9f53b063264a3607">tempest-TestExecuteHostMaintenanceStrategy-1384268311-project-admin</nova:user>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:project uuid="6a4dc4388a0f4e9eb9abef43e0bc8df1">tempest-TestExecuteHostMaintenanceStrategy-1384268311</nova:project>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       </nova:owner>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <nova:root type="image" uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <nova:ports>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         <nova:port uuid="a5d15d73-4376-43e7-b38b-d19871d5a694">
Jan 27 22:25:17 compute-1 nova_compute[183751]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:         </nova:port>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       </nova:ports>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     </nova:instance>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   </metadata>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <sysinfo type="smbios">
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <system>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <entry name="manufacturer">RDO</entry>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <entry name="product">OpenStack Compute</entry>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <entry name="serial">0a9c96e3-896f-4d36-9676-a5a374c3482b</entry>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <entry name="uuid">0a9c96e3-896f-4d36-9676-a5a374c3482b</entry>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <entry name="family">Virtual Machine</entry>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     </system>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   </sysinfo>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <os>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <boot dev="hd"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <smbios mode="sysinfo"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   </os>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <features>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <acpi/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <apic/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <vmcoreinfo/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   </features>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <clock offset="utc">
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <timer name="pit" tickpolicy="delay"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <timer name="hpet" present="no"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   </clock>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <cpu mode="custom" match="exact">
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <model>Nehalem</model>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <topology sockets="1" cores="1" threads="1"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   </cpu>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   <devices>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <disk type="file" device="disk">
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <target dev="vda" bus="virtio"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <disk type="file" device="cdrom">
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <driver name="qemu" type="raw" cache="none"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk.config"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <target dev="sda" bus="sata"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <interface type="ethernet">
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <mac address="fa:16:3e:c6:c7:2f"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <driver name="vhost" rx_queue_size="512"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <mtu size="1442"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <target dev="tapa5d15d73-43"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     </interface>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <serial type="pty">
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <log file="/var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/console.log" append="off"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     </serial>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <video>
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     </video>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <input type="tablet" bus="usb"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <rng model="virtio">
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <backend model="random">/dev/urandom</backend>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     </rng>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <controller type="usb" index="0"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 27 22:25:17 compute-1 nova_compute[183751]:       <stats period="10"/>
Jan 27 22:25:17 compute-1 nova_compute[183751]:     </memballoon>
Jan 27 22:25:17 compute-1 nova_compute[183751]:   </devices>
Jan 27 22:25:17 compute-1 nova_compute[183751]: </domain>
Jan 27 22:25:17 compute-1 nova_compute[183751]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.507 183755 DEBUG nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Preparing to wait for external event network-vif-plugged-a5d15d73-4376-43e7-b38b-d19871d5a694 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.508 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.508 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.508 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.509 183755 DEBUG nova.virt.libvirt.vif [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:25:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1022785074',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1022785074',id=6,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a4dc4388a0f4e9eb9abef43e0bc8df1',ramdisk_id='',reservation_id='r-yc6iod8n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1384268311',owner_user_name='
tempest-TestExecuteHostMaintenanceStrategy-1384268311-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:25:10Z,user_data=None,user_id='8763102ab7304e1d9f53b063264a3607',uuid=0a9c96e3-896f-4d36-9676-a5a374c3482b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5d15d73-4376-43e7-b38b-d19871d5a694", "address": "fa:16:3e:c6:c7:2f", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5d15d73-43", "ovs_interfaceid": "a5d15d73-4376-43e7-b38b-d19871d5a694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.510 183755 DEBUG nova.network.os_vif_util [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converting VIF {"id": "a5d15d73-4376-43e7-b38b-d19871d5a694", "address": "fa:16:3e:c6:c7:2f", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5d15d73-43", "ovs_interfaceid": "a5d15d73-4376-43e7-b38b-d19871d5a694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.511 183755 DEBUG nova.network.os_vif_util [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:c7:2f,bridge_name='br-int',has_traffic_filtering=True,id=a5d15d73-4376-43e7-b38b-d19871d5a694,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5d15d73-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.511 183755 DEBUG os_vif [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:c7:2f,bridge_name='br-int',has_traffic_filtering=True,id=a5d15d73-4376-43e7-b38b-d19871d5a694,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5d15d73-43') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.512 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.513 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.513 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.514 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.515 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3981194e-0023-5f5d-b94c-4c3d2165fd99', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.517 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.519 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.524 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.524 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa5d15d73-43, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.525 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa5d15d73-43, col_values=(('qos', UUID('cc18fc53-3108-4e04-9838-35c6a0e46cd0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.525 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa5d15d73-43, col_values=(('external_ids', {'iface-id': 'a5d15d73-4376-43e7-b38b-d19871d5a694', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:c7:2f', 'vm-uuid': '0a9c96e3-896f-4d36-9676-a5a374c3482b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.528 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:17 compute-1 NetworkManager[56069]: <info>  [1769552717.5299] manager: (tapa5d15d73-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.530 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.536 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:17 compute-1 nova_compute[183751]: 2026-01-27 22:25:17.537 183755 INFO os_vif [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:c7:2f,bridge_name='br-int',has_traffic_filtering=True,id=a5d15d73-4376-43e7-b38b-d19871d5a694,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5d15d73-43')
Jan 27 22:25:19 compute-1 nova_compute[183751]: 2026-01-27 22:25:19.086 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:25:19 compute-1 nova_compute[183751]: 2026-01-27 22:25:19.087 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:25:19 compute-1 nova_compute[183751]: 2026-01-27 22:25:19.087 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] No VIF found with MAC fa:16:3e:c6:c7:2f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:25:19 compute-1 nova_compute[183751]: 2026-01-27 22:25:19.088 183755 INFO nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Using config drive
Jan 27 22:25:19 compute-1 openstack_network_exporter[195945]: ERROR   22:25:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:25:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:25:19 compute-1 openstack_network_exporter[195945]: ERROR   22:25:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:25:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:25:19 compute-1 nova_compute[183751]: 2026-01-27 22:25:19.599 183755 WARNING neutronclient.v2_0.client [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:25:20 compute-1 nova_compute[183751]: 2026-01-27 22:25:20.939 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:21 compute-1 nova_compute[183751]: 2026-01-27 22:25:21.437 183755 INFO nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Creating config drive at /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk.config
Jan 27 22:25:21 compute-1 nova_compute[183751]: 2026-01-27 22:25:21.444 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp8cjdnltc execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:25:21 compute-1 nova_compute[183751]: 2026-01-27 22:25:21.586 183755 DEBUG oslo_concurrency.processutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp8cjdnltc" returned: 0 in 0.141s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:25:21 compute-1 kernel: tapa5d15d73-43: entered promiscuous mode
Jan 27 22:25:21 compute-1 NetworkManager[56069]: <info>  [1769552721.6738] manager: (tapa5d15d73-43): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Jan 27 22:25:21 compute-1 nova_compute[183751]: 2026-01-27 22:25:21.673 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:21 compute-1 ovn_controller[95969]: 2026-01-27T22:25:21Z|00058|binding|INFO|Claiming lport a5d15d73-4376-43e7-b38b-d19871d5a694 for this chassis.
Jan 27 22:25:21 compute-1 ovn_controller[95969]: 2026-01-27T22:25:21Z|00059|binding|INFO|a5d15d73-4376-43e7-b38b-d19871d5a694: Claiming fa:16:3e:c6:c7:2f 10.100.0.13
Jan 27 22:25:21 compute-1 nova_compute[183751]: 2026-01-27 22:25:21.677 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.693 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:c7:2f 10.100.0.13'], port_security=['fa:16:3e:c6:c7:2f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0a9c96e3-896f-4d36-9676-a5a374c3482b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da4007d0-b29b-4778-a286-b5dd1155cf44', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a4dc4388a0f4e9eb9abef43e0bc8df1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c5db407e-e957-436c-a734-22d98b76b6e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6f757ff-7e1b-47d8-a25a-9e1dab5d0324, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=a5d15d73-4376-43e7-b38b-d19871d5a694) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.696 105247 INFO neutron.agent.ovn.metadata.agent [-] Port a5d15d73-4376-43e7-b38b-d19871d5a694 in datapath da4007d0-b29b-4778-a286-b5dd1155cf44 bound to our chassis
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.697 105247 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network da4007d0-b29b-4778-a286-b5dd1155cf44
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.715 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[1d753718-e84d-4b86-ba8a-af206dcf08b9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.716 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapda4007d0-b1 in ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.717 212869 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapda4007d0-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.717 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6b7976-db25-478e-bc5d-f3c516224428]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.718 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[6c07e414-1945-4c1d-88ed-723c3b51b2f7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 systemd-udevd[216756]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.731 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[affd2da6-ec96-463e-8934-3e47239f291f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 systemd-machined[155034]: New machine qemu-3-instance-00000006.
Jan 27 22:25:21 compute-1 nova_compute[183751]: 2026-01-27 22:25:21.739 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:21 compute-1 NetworkManager[56069]: <info>  [1769552721.7441] device (tapa5d15d73-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 22:25:21 compute-1 NetworkManager[56069]: <info>  [1769552721.7454] device (tapa5d15d73-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 27 22:25:21 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Jan 27 22:25:21 compute-1 ovn_controller[95969]: 2026-01-27T22:25:21Z|00060|binding|INFO|Setting lport a5d15d73-4376-43e7-b38b-d19871d5a694 ovn-installed in OVS
Jan 27 22:25:21 compute-1 ovn_controller[95969]: 2026-01-27T22:25:21Z|00061|binding|INFO|Setting lport a5d15d73-4376-43e7-b38b-d19871d5a694 up in Southbound
Jan 27 22:25:21 compute-1 nova_compute[183751]: 2026-01-27 22:25:21.750 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.752 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[cddd8ce6-0d49-497c-a801-a057b1106021]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.791 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[10c94643-1f07-46d5-8ef3-59e06e094a0e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.796 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[3b0c26d3-2560-4c7b-afba-3f27a63b0a63]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 NetworkManager[56069]: <info>  [1769552721.7983] manager: (tapda4007d0-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/30)
Jan 27 22:25:21 compute-1 systemd-udevd[216761]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.837 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[1b682986-9ef6-4ded-9a5c-e6534630fbd9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.840 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8f2636-2f25-4da5-bea2-92358031583c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 NetworkManager[56069]: <info>  [1769552721.8712] device (tapda4007d0-b0): carrier: link connected
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.877 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[19f6a2ec-2732-45b4-ae41-5019d291e402]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.896 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[19134250-9a80-4d4f-9d3d-dbe65dedeff3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda4007d0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:a2:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 886413, 'reachable_time': 31784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216789, 'error': None, 'target': 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.918 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a9b15d-f8e5-4bfa-b11c-32d7c73ad821]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:a269'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 886413, 'tstamp': 886413}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216790, 'error': None, 'target': 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.938 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[914ebb10-9332-4cc5-8ce3-43739ac6253a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda4007d0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:a2:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 886413, 'reachable_time': 31784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216791, 'error': None, 'target': 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:21.971 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[fff170d0-e8fd-4b28-96fd-e117751f4cda]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.042 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[60720957-46bc-4c3f-a771-38242800b7dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.043 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda4007d0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.044 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.044 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda4007d0-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:22 compute-1 NetworkManager[56069]: <info>  [1769552722.0469] manager: (tapda4007d0-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 27 22:25:22 compute-1 kernel: tapda4007d0-b0: entered promiscuous mode
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.046 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.049 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapda4007d0-b0, col_values=(('external_ids', {'iface-id': 'b2c930e7-5341-4fb1-8625-e406ed44ab80'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.050 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:22 compute-1 ovn_controller[95969]: 2026-01-27T22:25:22Z|00062|binding|INFO|Releasing lport b2c930e7-5341-4fb1-8625-e406ed44ab80 from this chassis (sb_readonly=0)
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.072 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.074 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[598dc081-abc2-45af-a770-e22cdff602df]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.075 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.075 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.075 105247 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for da4007d0-b29b-4778-a286-b5dd1155cf44 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.076 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.076 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[f533629a-601c-41c6-a4a2-896e130e751b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.077 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.077 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[39e7a0b8-5ed9-4770-80df-644fe43eb7e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.078 105247 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: global
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     log         /dev/log local0 debug
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     log-tag     haproxy-metadata-proxy-da4007d0-b29b-4778-a286-b5dd1155cf44
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     user        root
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     group       root
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     maxconn     1024
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     pidfile     /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     daemon
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: defaults
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     log global
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     mode http
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     option httplog
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     option dontlognull
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     option http-server-close
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     option forwardfor
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     retries                 3
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     timeout http-request    30s
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     timeout connect         30s
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     timeout client          32s
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     timeout server          32s
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     timeout http-keep-alive 30s
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: listen listener
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     bind 169.254.169.254:80
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     server metadata /var/lib/neutron/metadata_proxy
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:     http-request add-header X-OVN-Network-ID da4007d0-b29b-4778-a286-b5dd1155cf44
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 27 22:25:22 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:22.078 105247 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'env', 'PROCESS_TAG=haproxy-da4007d0-b29b-4778-a286-b5dd1155cf44', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/da4007d0-b29b-4778-a286-b5dd1155cf44.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.177 183755 DEBUG nova.compute.manager [req-7876b47b-a014-4ae9-827c-0e77f42ad9d3 req-643e50ae-5708-472f-b26b-0cf928355222 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Received event network-vif-plugged-a5d15d73-4376-43e7-b38b-d19871d5a694 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.177 183755 DEBUG oslo_concurrency.lockutils [req-7876b47b-a014-4ae9-827c-0e77f42ad9d3 req-643e50ae-5708-472f-b26b-0cf928355222 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.178 183755 DEBUG oslo_concurrency.lockutils [req-7876b47b-a014-4ae9-827c-0e77f42ad9d3 req-643e50ae-5708-472f-b26b-0cf928355222 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.178 183755 DEBUG oslo_concurrency.lockutils [req-7876b47b-a014-4ae9-827c-0e77f42ad9d3 req-643e50ae-5708-472f-b26b-0cf928355222 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.178 183755 DEBUG nova.compute.manager [req-7876b47b-a014-4ae9-827c-0e77f42ad9d3 req-643e50ae-5708-472f-b26b-0cf928355222 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Processing event network-vif-plugged-a5d15d73-4376-43e7-b38b-d19871d5a694 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.358 183755 DEBUG nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.367 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.371 183755 INFO nova.virt.libvirt.driver [-] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Instance spawned successfully.
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.371 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 27 22:25:22 compute-1 podman[216827]: 2026-01-27 22:25:22.513365554 +0000 UTC m=+0.074847734 container create c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260126, tcib_managed=true)
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.528 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:22 compute-1 podman[216827]: 2026-01-27 22:25:22.474110812 +0000 UTC m=+0.035593072 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 22:25:22 compute-1 systemd[1]: Started libpod-conmon-c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595.scope.
Jan 27 22:25:22 compute-1 systemd[1]: Started libcrun container.
Jan 27 22:25:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/703c44603838968005316f959989e9826ba925dc32a3fc2d2798ab054d8edf50/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 22:25:22 compute-1 podman[216827]: 2026-01-27 22:25:22.618453285 +0000 UTC m=+0.179935475 container init c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 27 22:25:22 compute-1 podman[216827]: 2026-01-27 22:25:22.62671764 +0000 UTC m=+0.188199830 container start c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:25:22 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[216842]: [NOTICE]   (216846) : New worker (216848) forked
Jan 27 22:25:22 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[216842]: [NOTICE]   (216846) : Loading success.
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.886 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.887 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.888 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.889 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.890 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:25:22 compute-1 nova_compute[183751]: 2026-01-27 22:25:22.890 183755 DEBUG nova.virt.libvirt.driver [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:25:23 compute-1 nova_compute[183751]: 2026-01-27 22:25:23.400 183755 INFO nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Took 11.78 seconds to spawn the instance on the hypervisor.
Jan 27 22:25:23 compute-1 nova_compute[183751]: 2026-01-27 22:25:23.401 183755 DEBUG nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 27 22:25:23 compute-1 nova_compute[183751]: 2026-01-27 22:25:23.958 183755 INFO nova.compute.manager [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Took 17.09 seconds to build instance.
Jan 27 22:25:24 compute-1 nova_compute[183751]: 2026-01-27 22:25:24.467 183755 DEBUG oslo_concurrency.lockutils [None req-4f776f2f-af45-4357-945e-563a36ffb596 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.625s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:24 compute-1 nova_compute[183751]: 2026-01-27 22:25:24.562 183755 DEBUG nova.compute.manager [req-6dc4791b-ceba-46e9-b044-c917b3136d19 req-c31a6354-5fbd-4e5b-b234-98a7e534b802 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Received event network-vif-plugged-a5d15d73-4376-43e7-b38b-d19871d5a694 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:25:24 compute-1 nova_compute[183751]: 2026-01-27 22:25:24.563 183755 DEBUG oslo_concurrency.lockutils [req-6dc4791b-ceba-46e9-b044-c917b3136d19 req-c31a6354-5fbd-4e5b-b234-98a7e534b802 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:24 compute-1 nova_compute[183751]: 2026-01-27 22:25:24.564 183755 DEBUG oslo_concurrency.lockutils [req-6dc4791b-ceba-46e9-b044-c917b3136d19 req-c31a6354-5fbd-4e5b-b234-98a7e534b802 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:24 compute-1 nova_compute[183751]: 2026-01-27 22:25:24.564 183755 DEBUG oslo_concurrency.lockutils [req-6dc4791b-ceba-46e9-b044-c917b3136d19 req-c31a6354-5fbd-4e5b-b234-98a7e534b802 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:24 compute-1 nova_compute[183751]: 2026-01-27 22:25:24.565 183755 DEBUG nova.compute.manager [req-6dc4791b-ceba-46e9-b044-c917b3136d19 req-c31a6354-5fbd-4e5b-b234-98a7e534b802 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] No waiting events found dispatching network-vif-plugged-a5d15d73-4376-43e7-b38b-d19871d5a694 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:25:24 compute-1 nova_compute[183751]: 2026-01-27 22:25:24.565 183755 WARNING nova.compute.manager [req-6dc4791b-ceba-46e9-b044-c917b3136d19 req-c31a6354-5fbd-4e5b-b234-98a7e534b802 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Received unexpected event network-vif-plugged-a5d15d73-4376-43e7-b38b-d19871d5a694 for instance with vm_state active and task_state None.
Jan 27 22:25:25 compute-1 nova_compute[183751]: 2026-01-27 22:25:25.943 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:27 compute-1 nova_compute[183751]: 2026-01-27 22:25:27.531 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:28 compute-1 nova_compute[183751]: 2026-01-27 22:25:28.928 183755 DEBUG oslo_concurrency.lockutils [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "0a9c96e3-896f-4d36-9676-a5a374c3482b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:28 compute-1 nova_compute[183751]: 2026-01-27 22:25:28.930 183755 DEBUG oslo_concurrency.lockutils [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:28 compute-1 nova_compute[183751]: 2026-01-27 22:25:28.930 183755 DEBUG oslo_concurrency.lockutils [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:28 compute-1 nova_compute[183751]: 2026-01-27 22:25:28.931 183755 DEBUG oslo_concurrency.lockutils [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:28 compute-1 nova_compute[183751]: 2026-01-27 22:25:28.932 183755 DEBUG oslo_concurrency.lockutils [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:28 compute-1 nova_compute[183751]: 2026-01-27 22:25:28.945 183755 INFO nova.compute.manager [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Terminating instance
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.464 183755 DEBUG nova.compute.manager [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 27 22:25:29 compute-1 kernel: tapa5d15d73-43 (unregistering): left promiscuous mode
Jan 27 22:25:29 compute-1 NetworkManager[56069]: <info>  [1769552729.4885] device (tapa5d15d73-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 27 22:25:29 compute-1 ovn_controller[95969]: 2026-01-27T22:25:29Z|00063|binding|INFO|Releasing lport a5d15d73-4376-43e7-b38b-d19871d5a694 from this chassis (sb_readonly=0)
Jan 27 22:25:29 compute-1 ovn_controller[95969]: 2026-01-27T22:25:29Z|00064|binding|INFO|Setting lport a5d15d73-4376-43e7-b38b-d19871d5a694 down in Southbound
Jan 27 22:25:29 compute-1 ovn_controller[95969]: 2026-01-27T22:25:29Z|00065|binding|INFO|Removing iface tapa5d15d73-43 ovn-installed in OVS
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.513 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.517 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.531 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:c7:2f 10.100.0.13'], port_security=['fa:16:3e:c6:c7:2f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0a9c96e3-896f-4d36-9676-a5a374c3482b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da4007d0-b29b-4778-a286-b5dd1155cf44', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a4dc4388a0f4e9eb9abef43e0bc8df1', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c5db407e-e957-436c-a734-22d98b76b6e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6f757ff-7e1b-47d8-a25a-9e1dab5d0324, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=a5d15d73-4376-43e7-b38b-d19871d5a694) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.532 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.532 105247 INFO neutron.agent.ovn.metadata.agent [-] Port a5d15d73-4376-43e7-b38b-d19871d5a694 in datapath da4007d0-b29b-4778-a286-b5dd1155cf44 unbound from our chassis
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.534 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network da4007d0-b29b-4778-a286-b5dd1155cf44, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.535 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[cc1c05a5-7a01-4db8-8ba0-a8cca3af260f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.535 105247 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44 namespace which is not needed anymore
Jan 27 22:25:29 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 27 22:25:29 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 7.851s CPU time.
Jan 27 22:25:29 compute-1 systemd-machined[155034]: Machine qemu-3-instance-00000006 terminated.
Jan 27 22:25:29 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[216842]: [NOTICE]   (216846) : haproxy version is 3.0.5-8e879a5
Jan 27 22:25:29 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[216842]: [NOTICE]   (216846) : path to executable is /usr/sbin/haproxy
Jan 27 22:25:29 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[216842]: [WARNING]  (216846) : Exiting Master process...
Jan 27 22:25:29 compute-1 podman[216881]: 2026-01-27 22:25:29.694401152 +0000 UTC m=+0.039901508 container kill c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, tcib_managed=true, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 27 22:25:29 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[216842]: [ALERT]    (216846) : Current worker (216848) exited with code 143 (Terminated)
Jan 27 22:25:29 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[216842]: [WARNING]  (216846) : All workers exited. Exiting... (0)
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.696 183755 DEBUG nova.compute.manager [req-e95ab69e-cb30-4848-a91a-966e75258031 req-1f8e13e9-cb2b-4a2a-a885-a8faa83f0add 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Received event network-vif-unplugged-a5d15d73-4376-43e7-b38b-d19871d5a694 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.696 183755 DEBUG oslo_concurrency.lockutils [req-e95ab69e-cb30-4848-a91a-966e75258031 req-1f8e13e9-cb2b-4a2a-a885-a8faa83f0add 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.697 183755 DEBUG oslo_concurrency.lockutils [req-e95ab69e-cb30-4848-a91a-966e75258031 req-1f8e13e9-cb2b-4a2a-a885-a8faa83f0add 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.698 183755 DEBUG oslo_concurrency.lockutils [req-e95ab69e-cb30-4848-a91a-966e75258031 req-1f8e13e9-cb2b-4a2a-a885-a8faa83f0add 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:29 compute-1 systemd[1]: libpod-c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595.scope: Deactivated successfully.
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.698 183755 DEBUG nova.compute.manager [req-e95ab69e-cb30-4848-a91a-966e75258031 req-1f8e13e9-cb2b-4a2a-a885-a8faa83f0add 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] No waiting events found dispatching network-vif-unplugged-a5d15d73-4376-43e7-b38b-d19871d5a694 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.699 183755 DEBUG nova.compute.manager [req-e95ab69e-cb30-4848-a91a-966e75258031 req-1f8e13e9-cb2b-4a2a-a885-a8faa83f0add 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Received event network-vif-unplugged-a5d15d73-4376-43e7-b38b-d19871d5a694 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.739 183755 INFO nova.virt.libvirt.driver [-] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Instance destroyed successfully.
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.740 183755 DEBUG nova.objects.instance [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lazy-loading 'resources' on Instance uuid 0a9c96e3-896f-4d36-9676-a5a374c3482b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:25:29 compute-1 podman[216900]: 2026-01-27 22:25:29.765530123 +0000 UTC m=+0.043292423 container died c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 27 22:25:29 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595-userdata-shm.mount: Deactivated successfully.
Jan 27 22:25:29 compute-1 systemd[1]: var-lib-containers-storage-overlay-703c44603838968005316f959989e9826ba925dc32a3fc2d2798ab054d8edf50-merged.mount: Deactivated successfully.
Jan 27 22:25:29 compute-1 podman[216900]: 2026-01-27 22:25:29.814835503 +0000 UTC m=+0.092597833 container cleanup c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 27 22:25:29 compute-1 systemd[1]: libpod-conmon-c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595.scope: Deactivated successfully.
Jan 27 22:25:29 compute-1 podman[216909]: 2026-01-27 22:25:29.837986727 +0000 UTC m=+0.094612573 container remove c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260126)
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.846 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[49beac43-8dbb-46d6-90a1-df946455f89a]: (4, ("Tue Jan 27 10:25:29 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44 (c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595)\nc0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595\nTue Jan 27 10:25:29 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44 (c0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595)\nc0a9dd460a99d2b02071fea7183bde758f88bf8b1593a88dcb2a693a90146595\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.847 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[b8cd638e-ee71-4ccf-a866-f1c255f63f4a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.848 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.848 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[2e61bf89-1581-407a-baf6-fc784930ac92]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.849 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda4007d0-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.852 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:29 compute-1 kernel: tapda4007d0-b0: left promiscuous mode
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.882 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:29 compute-1 nova_compute[183751]: 2026-01-27 22:25:29.885 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.885 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[666d90a0-619e-40d3-8051-06cb1a15595f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.905 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[02486d9f-47ec-4fe1-8b83-3d1c43a02456]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.906 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[0fc6712e-6c30-42b5-98de-92ae16081cb2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.929 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[aa4f321b-04c6-4d04-a64c-92f1e435bb8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 886404, 'reachable_time': 34123, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216946, 'error': None, 'target': 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.933 105687 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 27 22:25:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:25:29.933 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5b9cac-3554-454e-9323-5bd477226416]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:25:29 compute-1 systemd[1]: run-netns-ovnmeta\x2dda4007d0\x2db29b\x2d4778\x2da286\x2db5dd1155cf44.mount: Deactivated successfully.
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.247 183755 DEBUG nova.virt.libvirt.vif [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-27T22:25:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1022785074',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1022785074',id=6,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-27T22:25:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6a4dc4388a0f4e9eb9abef43e0bc8df1',ramdisk_id='',reservation_id='r-yc6iod8n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1384268311',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1384268311-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-27T22:25:23Z,user_data=None,user_id='8763102ab7304e1d9f53b063264a3607',uuid=0a9c96e3-896f-4d36-9676-a5a374c3482b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5d15d73-4376-43e7-b38b-d19871d5a694", "address": "fa:16:3e:c6:c7:2f", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5d15d73-43", "ovs_interfaceid": "a5d15d73-4376-43e7-b38b-d19871d5a694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.248 183755 DEBUG nova.network.os_vif_util [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converting VIF {"id": "a5d15d73-4376-43e7-b38b-d19871d5a694", "address": "fa:16:3e:c6:c7:2f", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5d15d73-43", "ovs_interfaceid": "a5d15d73-4376-43e7-b38b-d19871d5a694", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.249 183755 DEBUG nova.network.os_vif_util [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:c7:2f,bridge_name='br-int',has_traffic_filtering=True,id=a5d15d73-4376-43e7-b38b-d19871d5a694,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5d15d73-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.249 183755 DEBUG os_vif [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:c7:2f,bridge_name='br-int',has_traffic_filtering=True,id=a5d15d73-4376-43e7-b38b-d19871d5a694,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5d15d73-43') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.252 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.252 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5d15d73-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.254 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.256 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.257 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.257 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=cc18fc53-3108-4e04-9838-35c6a0e46cd0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.258 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.259 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.262 183755 INFO os_vif [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:c7:2f,bridge_name='br-int',has_traffic_filtering=True,id=a5d15d73-4376-43e7-b38b-d19871d5a694,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5d15d73-43')
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.263 183755 INFO nova.virt.libvirt.driver [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Deleting instance files /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b_del
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.264 183755 INFO nova.virt.libvirt.driver [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Deletion of /var/lib/nova/instances/0a9c96e3-896f-4d36-9676-a5a374c3482b_del complete
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.783 183755 INFO nova.compute.manager [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Took 1.32 seconds to destroy the instance on the hypervisor.
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.784 183755 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.784 183755 DEBUG nova.compute.manager [-] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.784 183755 DEBUG nova.network.neutron [-] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.785 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:25:30 compute-1 nova_compute[183751]: 2026-01-27 22:25:30.944 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:31 compute-1 nova_compute[183751]: 2026-01-27 22:25:31.465 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:25:31 compute-1 nova_compute[183751]: 2026-01-27 22:25:31.772 183755 DEBUG nova.compute.manager [req-6972c9d5-e8ae-4305-a04f-8036615b42a8 req-338a9f6e-7d7a-407e-a9c4-9f3c5ccd67e3 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Received event network-vif-unplugged-a5d15d73-4376-43e7-b38b-d19871d5a694 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:25:31 compute-1 nova_compute[183751]: 2026-01-27 22:25:31.772 183755 DEBUG oslo_concurrency.lockutils [req-6972c9d5-e8ae-4305-a04f-8036615b42a8 req-338a9f6e-7d7a-407e-a9c4-9f3c5ccd67e3 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:31 compute-1 nova_compute[183751]: 2026-01-27 22:25:31.773 183755 DEBUG oslo_concurrency.lockutils [req-6972c9d5-e8ae-4305-a04f-8036615b42a8 req-338a9f6e-7d7a-407e-a9c4-9f3c5ccd67e3 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:31 compute-1 nova_compute[183751]: 2026-01-27 22:25:31.773 183755 DEBUG oslo_concurrency.lockutils [req-6972c9d5-e8ae-4305-a04f-8036615b42a8 req-338a9f6e-7d7a-407e-a9c4-9f3c5ccd67e3 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:31 compute-1 nova_compute[183751]: 2026-01-27 22:25:31.774 183755 DEBUG nova.compute.manager [req-6972c9d5-e8ae-4305-a04f-8036615b42a8 req-338a9f6e-7d7a-407e-a9c4-9f3c5ccd67e3 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] No waiting events found dispatching network-vif-unplugged-a5d15d73-4376-43e7-b38b-d19871d5a694 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:25:31 compute-1 nova_compute[183751]: 2026-01-27 22:25:31.774 183755 DEBUG nova.compute.manager [req-6972c9d5-e8ae-4305-a04f-8036615b42a8 req-338a9f6e-7d7a-407e-a9c4-9f3c5ccd67e3 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Received event network-vif-unplugged-a5d15d73-4376-43e7-b38b-d19871d5a694 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:25:32 compute-1 nova_compute[183751]: 2026-01-27 22:25:32.330 183755 DEBUG nova.network.neutron [-] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:25:32 compute-1 nova_compute[183751]: 2026-01-27 22:25:32.838 183755 INFO nova.compute.manager [-] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Took 2.05 seconds to deallocate network for instance.
Jan 27 22:25:33 compute-1 nova_compute[183751]: 2026-01-27 22:25:33.365 183755 DEBUG oslo_concurrency.lockutils [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:33 compute-1 nova_compute[183751]: 2026-01-27 22:25:33.366 183755 DEBUG oslo_concurrency.lockutils [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:33 compute-1 nova_compute[183751]: 2026-01-27 22:25:33.439 183755 DEBUG nova.compute.provider_tree [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:25:33 compute-1 nova_compute[183751]: 2026-01-27 22:25:33.830 183755 DEBUG nova.compute.manager [req-c0ea0606-2f3b-4bde-b564-375d645596da req-6bd954ff-5f16-4e84-9455-2adf62e71261 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 0a9c96e3-896f-4d36-9676-a5a374c3482b] Received event network-vif-deleted-a5d15d73-4376-43e7-b38b-d19871d5a694 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:25:33 compute-1 podman[216948]: 2026-01-27 22:25:33.84264817 +0000 UTC m=+0.144141349 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS)
Jan 27 22:25:33 compute-1 nova_compute[183751]: 2026-01-27 22:25:33.948 183755 DEBUG nova.scheduler.client.report [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:25:34 compute-1 nova_compute[183751]: 2026-01-27 22:25:34.459 183755 DEBUG oslo_concurrency.lockutils [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:34 compute-1 nova_compute[183751]: 2026-01-27 22:25:34.506 183755 INFO nova.scheduler.client.report [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Deleted allocations for instance 0a9c96e3-896f-4d36-9676-a5a374c3482b
Jan 27 22:25:35 compute-1 nova_compute[183751]: 2026-01-27 22:25:35.259 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:35 compute-1 nova_compute[183751]: 2026-01-27 22:25:35.550 183755 DEBUG oslo_concurrency.lockutils [None req-7d4fbe8d-a0b9-430b-afad-da689e3b518c 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "0a9c96e3-896f-4d36-9676-a5a374c3482b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.620s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:35 compute-1 podman[193064]: time="2026-01-27T22:25:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:25:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:25:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:25:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:25:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 27 22:25:35 compute-1 nova_compute[183751]: 2026-01-27 22:25:35.945 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:38 compute-1 podman[216977]: 2026-01-27 22:25:38.793453534 +0000 UTC m=+0.085828325 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 27 22:25:38 compute-1 podman[216976]: 2026-01-27 22:25:38.825466497 +0000 UTC m=+0.125655152 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Jan 27 22:25:40 compute-1 nova_compute[183751]: 2026-01-27 22:25:40.262 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:40 compute-1 nova_compute[183751]: 2026-01-27 22:25:40.947 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:45 compute-1 nova_compute[183751]: 2026-01-27 22:25:45.265 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:45 compute-1 nova_compute[183751]: 2026-01-27 22:25:45.950 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:47 compute-1 podman[217017]: 2026-01-27 22:25:47.780383585 +0000 UTC m=+0.084177844 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:25:49 compute-1 openstack_network_exporter[195945]: ERROR   22:25:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:25:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:25:49 compute-1 openstack_network_exporter[195945]: ERROR   22:25:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:25:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:25:50 compute-1 nova_compute[183751]: 2026-01-27 22:25:50.267 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:50 compute-1 nova_compute[183751]: 2026-01-27 22:25:50.982 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:55 compute-1 nova_compute[183751]: 2026-01-27 22:25:55.270 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:55 compute-1 nova_compute[183751]: 2026-01-27 22:25:55.984 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:25:56 compute-1 nova_compute[183751]: 2026-01-27 22:25:56.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:25:57 compute-1 nova_compute[183751]: 2026-01-27 22:25:57.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.665 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.851 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.854 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.891 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.893 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5844MB free_disk=73.14242553710938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.893 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:25:58 compute-1 nova_compute[183751]: 2026-01-27 22:25:58.894 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:25:59 compute-1 nova_compute[183751]: 2026-01-27 22:25:59.960 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:25:59 compute-1 nova_compute[183751]: 2026-01-27 22:25:59.961 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:25:58 up  2:28,  0 user,  load average: 0.27, 0.19, 0.10\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:25:59 compute-1 nova_compute[183751]: 2026-01-27 22:25:59.986 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 22:26:00 compute-1 nova_compute[183751]: 2026-01-27 22:26:00.008 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 22:26:00 compute-1 nova_compute[183751]: 2026-01-27 22:26:00.008 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:26:00 compute-1 nova_compute[183751]: 2026-01-27 22:26:00.022 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 22:26:00 compute-1 nova_compute[183751]: 2026-01-27 22:26:00.049 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 22:26:00 compute-1 nova_compute[183751]: 2026-01-27 22:26:00.074 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:26:00 compute-1 nova_compute[183751]: 2026-01-27 22:26:00.272 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:00 compute-1 nova_compute[183751]: 2026-01-27 22:26:00.582 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:26:00 compute-1 nova_compute[183751]: 2026-01-27 22:26:00.987 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:01 compute-1 nova_compute[183751]: 2026-01-27 22:26:01.093 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:26:01 compute-1 nova_compute[183751]: 2026-01-27 22:26:01.094 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.200s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:03 compute-1 nova_compute[183751]: 2026-01-27 22:26:03.094 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:26:03 compute-1 nova_compute[183751]: 2026-01-27 22:26:03.095 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:26:03 compute-1 nova_compute[183751]: 2026-01-27 22:26:03.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:26:03 compute-1 nova_compute[183751]: 2026-01-27 22:26:03.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:26:03 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:03.356 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:26:03 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:03.357 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:26:03 compute-1 nova_compute[183751]: 2026-01-27 22:26:03.360 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:04 compute-1 podman[217044]: 2026-01-27 22:26:04.82328131 +0000 UTC m=+0.118002092 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, 
io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 27 22:26:05 compute-1 nova_compute[183751]: 2026-01-27 22:26:05.307 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:05 compute-1 podman[193064]: time="2026-01-27T22:26:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:26:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:26:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:26:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:26:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Jan 27 22:26:05 compute-1 nova_compute[183751]: 2026-01-27 22:26:05.988 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:06 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:06.359 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:09 compute-1 podman[217071]: 2026-01-27 22:26:09.77220426 +0000 UTC m=+0.085250051 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 27 22:26:09 compute-1 podman[217072]: 2026-01-27 22:26:09.77622236 +0000 UTC m=+0.087610120 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 22:26:10 compute-1 nova_compute[183751]: 2026-01-27 22:26:10.309 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:10 compute-1 nova_compute[183751]: 2026-01-27 22:26:10.991 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:11 compute-1 nova_compute[183751]: 2026-01-27 22:26:11.120 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:11 compute-1 nova_compute[183751]: 2026-01-27 22:26:11.120 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:11.270 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:11.270 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:11.270 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:11 compute-1 nova_compute[183751]: 2026-01-27 22:26:11.626 183755 DEBUG nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 27 22:26:12 compute-1 nova_compute[183751]: 2026-01-27 22:26:12.194 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:12 compute-1 nova_compute[183751]: 2026-01-27 22:26:12.195 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:12 compute-1 nova_compute[183751]: 2026-01-27 22:26:12.203 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 27 22:26:12 compute-1 nova_compute[183751]: 2026-01-27 22:26:12.203 183755 INFO nova.compute.claims [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Claim successful on node compute-1.ctlplane.example.com
Jan 27 22:26:13 compute-1 nova_compute[183751]: 2026-01-27 22:26:13.275 183755 DEBUG nova.compute.provider_tree [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:26:13 compute-1 nova_compute[183751]: 2026-01-27 22:26:13.784 183755 DEBUG nova.scheduler.client.report [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:26:14 compute-1 nova_compute[183751]: 2026-01-27 22:26:14.297 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:14 compute-1 nova_compute[183751]: 2026-01-27 22:26:14.298 183755 DEBUG nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 27 22:26:14 compute-1 nova_compute[183751]: 2026-01-27 22:26:14.813 183755 DEBUG nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 27 22:26:14 compute-1 nova_compute[183751]: 2026-01-27 22:26:14.814 183755 DEBUG nova.network.neutron [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 27 22:26:14 compute-1 nova_compute[183751]: 2026-01-27 22:26:14.814 183755 WARNING neutronclient.v2_0.client [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:26:14 compute-1 nova_compute[183751]: 2026-01-27 22:26:14.814 183755 WARNING neutronclient.v2_0.client [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:26:15 compute-1 nova_compute[183751]: 2026-01-27 22:26:15.356 183755 INFO nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 27 22:26:15 compute-1 nova_compute[183751]: 2026-01-27 22:26:15.360 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:15 compute-1 nova_compute[183751]: 2026-01-27 22:26:15.866 183755 DEBUG nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 27 22:26:15 compute-1 nova_compute[183751]: 2026-01-27 22:26:15.990 183755 DEBUG nova.network.neutron [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Successfully created port: 8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 27 22:26:15 compute-1 nova_compute[183751]: 2026-01-27 22:26:15.993 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.773 183755 DEBUG nova.network.neutron [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Successfully updated port: 8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.883 183755 DEBUG nova.compute.manager [req-fbb94d22-2101-46e5-8b64-e60a0b186fe1 req-345ea616-0b1c-4812-bd7b-acfc05f1a9eb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Received event network-changed-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.884 183755 DEBUG nova.compute.manager [req-fbb94d22-2101-46e5-8b64-e60a0b186fe1 req-345ea616-0b1c-4812-bd7b-acfc05f1a9eb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Refreshing instance network info cache due to event network-changed-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.884 183755 DEBUG oslo_concurrency.lockutils [req-fbb94d22-2101-46e5-8b64-e60a0b186fe1 req-345ea616-0b1c-4812-bd7b-acfc05f1a9eb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "refresh_cache-4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.885 183755 DEBUG oslo_concurrency.lockutils [req-fbb94d22-2101-46e5-8b64-e60a0b186fe1 req-345ea616-0b1c-4812-bd7b-acfc05f1a9eb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquired lock "refresh_cache-4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.885 183755 DEBUG nova.network.neutron [req-fbb94d22-2101-46e5-8b64-e60a0b186fe1 req-345ea616-0b1c-4812-bd7b-acfc05f1a9eb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Refreshing network info cache for port 8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.887 183755 DEBUG nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.889 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.890 183755 INFO nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Creating image(s)
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.891 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "/var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.891 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "/var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.892 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "/var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.893 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.898 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.901 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.996 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.998 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.998 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:16 compute-1 nova_compute[183751]: 2026-01-27 22:26:16.999 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.002 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.003 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.062 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.063 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.095 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.096 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.097 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.148 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.150 183755 DEBUG nova.virt.disk.api [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Checking if we can resize image /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.151 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.243 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.245 183755 DEBUG nova.virt.disk.api [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Cannot resize image /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.246 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.246 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Ensure instance console log exists: /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.246 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.247 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.247 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.281 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "refresh_cache-4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:26:17 compute-1 nova_compute[183751]: 2026-01-27 22:26:17.412 183755 WARNING neutronclient.v2_0.client [req-fbb94d22-2101-46e5-8b64-e60a0b186fe1 req-345ea616-0b1c-4812-bd7b-acfc05f1a9eb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:26:18 compute-1 nova_compute[183751]: 2026-01-27 22:26:18.076 183755 DEBUG nova.network.neutron [req-fbb94d22-2101-46e5-8b64-e60a0b186fe1 req-345ea616-0b1c-4812-bd7b-acfc05f1a9eb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:26:18 compute-1 nova_compute[183751]: 2026-01-27 22:26:18.293 183755 DEBUG nova.network.neutron [req-fbb94d22-2101-46e5-8b64-e60a0b186fe1 req-345ea616-0b1c-4812-bd7b-acfc05f1a9eb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:26:18 compute-1 podman[217125]: 2026-01-27 22:26:18.784839745 +0000 UTC m=+0.092466160 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:26:18 compute-1 nova_compute[183751]: 2026-01-27 22:26:18.800 183755 DEBUG oslo_concurrency.lockutils [req-fbb94d22-2101-46e5-8b64-e60a0b186fe1 req-345ea616-0b1c-4812-bd7b-acfc05f1a9eb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Releasing lock "refresh_cache-4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:26:18 compute-1 nova_compute[183751]: 2026-01-27 22:26:18.801 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquired lock "refresh_cache-4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:26:18 compute-1 nova_compute[183751]: 2026-01-27 22:26:18.801 183755 DEBUG nova.network.neutron [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 27 22:26:19 compute-1 openstack_network_exporter[195945]: ERROR   22:26:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:26:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:26:19 compute-1 openstack_network_exporter[195945]: ERROR   22:26:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:26:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:26:19 compute-1 nova_compute[183751]: 2026-01-27 22:26:19.556 183755 DEBUG nova.network.neutron [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:26:19 compute-1 nova_compute[183751]: 2026-01-27 22:26:19.807 183755 WARNING neutronclient.v2_0.client [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:26:19 compute-1 nova_compute[183751]: 2026-01-27 22:26:19.977 183755 DEBUG nova.network.neutron [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Updating instance_info_cache with network_info: [{"id": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "address": "fa:16:3e:50:20:5e", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8978ebf8-3b", "ovs_interfaceid": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.362 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.484 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Releasing lock "refresh_cache-4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.485 183755 DEBUG nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Instance network_info: |[{"id": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "address": "fa:16:3e:50:20:5e", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8978ebf8-3b", "ovs_interfaceid": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.487 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Start _get_guest_xml network_info=[{"id": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "address": "fa:16:3e:50:20:5e", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8978ebf8-3b", "ovs_interfaceid": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '46eb297a-0b7d-41f9-8336-a7ae35b5797e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.493 183755 WARNING nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.496 183755 DEBUG nova.virt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1330756688', uuid='4953f476-82d4-4a3e-ac76-b85ff3ad1d4f'), owner=OwnerMeta(userid='8763102ab7304e1d9f53b063264a3607', username='tempest-TestExecuteHostMaintenanceStrategy-1384268311-project-admin', projectid='6a4dc4388a0f4e9eb9abef43e0bc8df1', projectname='tempest-TestExecuteHostMaintenanceStrategy-1384268311'), image=ImageMeta(id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "address": "fa:16:3e:50:20:5e", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8978ebf8-3b", "ovs_interfaceid": 
"8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769552780.4960396) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.503 183755 DEBUG nova.virt.libvirt.host [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.504 183755 DEBUG nova.virt.libvirt.host [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.508 183755 DEBUG nova.virt.libvirt.host [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.508 183755 DEBUG nova.virt.libvirt.host [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.510 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.510 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-27T22:07:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.511 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.511 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.511 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.512 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.512 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.512 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.512 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.513 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.513 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.513 183755 DEBUG nova.virt.hardware [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.518 183755 DEBUG nova.virt.libvirt.vif [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:26:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1330756688',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1330756688',id=8,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a4dc4388a0f4e9eb9abef43e0bc8df1',ramdisk_id='',reservation_id='r-9hidts5a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1384268311',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1384268311-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:26:15Z,user_data=None,user_id='8763102ab7304e1d9f53b063264a3607',uuid=4953f476-82d4-4a3e-ac76-b85ff3ad1d4f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "address": "fa:16:3e:50:20:5e", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8978ebf8-3b", "ovs_interfaceid": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.519 183755 DEBUG nova.network.os_vif_util [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converting VIF {"id": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "address": "fa:16:3e:50:20:5e", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8978ebf8-3b", "ovs_interfaceid": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.520 183755 DEBUG nova.network.os_vif_util [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:20:5e,bridge_name='br-int',has_traffic_filtering=True,id=8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8978ebf8-3b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:26:20 compute-1 nova_compute[183751]: 2026-01-27 22:26:20.521 183755 DEBUG nova.objects.instance [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.047 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] End _get_guest_xml xml=<domain type="kvm">
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <uuid>4953f476-82d4-4a3e-ac76-b85ff3ad1d4f</uuid>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <name>instance-00000008</name>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <memory>131072</memory>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <vcpu>1</vcpu>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <metadata>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1330756688</nova:name>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <nova:creationTime>2026-01-27 22:26:20</nova:creationTime>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <nova:flavor name="m1.nano" id="4ed8ebce-d846-4d13-950a-4fb8c3a02942">
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:memory>128</nova:memory>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:disk>1</nova:disk>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:swap>0</nova:swap>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:ephemeral>0</nova:ephemeral>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:vcpus>1</nova:vcpus>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:extraSpecs>
Jan 27 22:26:21 compute-1 nova_compute[183751]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         </nova:extraSpecs>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       </nova:flavor>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <nova:image uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e">
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:minDisk>1</nova:minDisk>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:minRam>0</nova:minRam>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:properties>
Jan 27 22:26:21 compute-1 nova_compute[183751]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         </nova:properties>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       </nova:image>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <nova:owner>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:user uuid="8763102ab7304e1d9f53b063264a3607">tempest-TestExecuteHostMaintenanceStrategy-1384268311-project-admin</nova:user>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:project uuid="6a4dc4388a0f4e9eb9abef43e0bc8df1">tempest-TestExecuteHostMaintenanceStrategy-1384268311</nova:project>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       </nova:owner>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <nova:root type="image" uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <nova:ports>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         <nova:port uuid="8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e">
Jan 27 22:26:21 compute-1 nova_compute[183751]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:         </nova:port>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       </nova:ports>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     </nova:instance>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   </metadata>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <sysinfo type="smbios">
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <system>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <entry name="manufacturer">RDO</entry>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <entry name="product">OpenStack Compute</entry>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <entry name="serial">4953f476-82d4-4a3e-ac76-b85ff3ad1d4f</entry>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <entry name="uuid">4953f476-82d4-4a3e-ac76-b85ff3ad1d4f</entry>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <entry name="family">Virtual Machine</entry>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     </system>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   </sysinfo>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <os>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <boot dev="hd"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <smbios mode="sysinfo"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   </os>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <features>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <acpi/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <apic/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <vmcoreinfo/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   </features>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <clock offset="utc">
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <timer name="pit" tickpolicy="delay"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <timer name="hpet" present="no"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   </clock>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <cpu mode="custom" match="exact">
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <model>Nehalem</model>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <topology sockets="1" cores="1" threads="1"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   </cpu>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   <devices>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <disk type="file" device="disk">
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <target dev="vda" bus="virtio"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <disk type="file" device="cdrom">
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <driver name="qemu" type="raw" cache="none"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk.config"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <target dev="sda" bus="sata"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <interface type="ethernet">
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <mac address="fa:16:3e:50:20:5e"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <driver name="vhost" rx_queue_size="512"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <mtu size="1442"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <target dev="tap8978ebf8-3b"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     </interface>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <serial type="pty">
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <log file="/var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/console.log" append="off"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     </serial>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <video>
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     </video>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <input type="tablet" bus="usb"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <rng model="virtio">
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <backend model="random">/dev/urandom</backend>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     </rng>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <controller type="usb" index="0"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 27 22:26:21 compute-1 nova_compute[183751]:       <stats period="10"/>
Jan 27 22:26:21 compute-1 nova_compute[183751]:     </memballoon>
Jan 27 22:26:21 compute-1 nova_compute[183751]:   </devices>
Jan 27 22:26:21 compute-1 nova_compute[183751]: </domain>
Jan 27 22:26:21 compute-1 nova_compute[183751]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.048 183755 DEBUG nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Preparing to wait for external event network-vif-plugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.048 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.049 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.049 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.050 183755 DEBUG nova.virt.libvirt.vif [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:26:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1330756688',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1330756688',id=8,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a4dc4388a0f4e9eb9abef43e0bc8df1',ramdisk_id='',reservation_id='r-9hidts5a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1384268311',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1384268311-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:26:15Z,user_data=None,user_id='8763102ab7304e1d9f53b063264a3607',uuid=4953f476-82d4-4a3e-ac76-b85ff3ad1d4f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "address": "fa:16:3e:50:20:5e", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8978ebf8-3b", "ovs_interfaceid": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.051 183755 DEBUG nova.network.os_vif_util [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converting VIF {"id": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "address": "fa:16:3e:50:20:5e", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8978ebf8-3b", "ovs_interfaceid": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.052 183755 DEBUG nova.network.os_vif_util [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:20:5e,bridge_name='br-int',has_traffic_filtering=True,id=8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8978ebf8-3b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.053 183755 DEBUG os_vif [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:20:5e,bridge_name='br-int',has_traffic_filtering=True,id=8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8978ebf8-3b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.054 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.056 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.057 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.058 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.059 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ec6de667-9af5-5a30-9a74-160a83b0b176', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.061 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.063 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.063 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.068 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.068 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8978ebf8-3b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.069 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap8978ebf8-3b, col_values=(('qos', UUID('93604307-1a42-4344-aa9e-0c3c41d9cad2')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.069 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap8978ebf8-3b, col_values=(('external_ids', {'iface-id': '8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:20:5e', 'vm-uuid': '4953f476-82d4-4a3e-ac76-b85ff3ad1d4f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.071 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:21 compute-1 NetworkManager[56069]: <info>  [1769552781.0726] manager: (tap8978ebf8-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.073 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.080 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:21 compute-1 nova_compute[183751]: 2026-01-27 22:26:21.081 183755 INFO os_vif [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:20:5e,bridge_name='br-int',has_traffic_filtering=True,id=8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8978ebf8-3b')
Jan 27 22:26:22 compute-1 nova_compute[183751]: 2026-01-27 22:26:22.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:26:22 compute-1 nova_compute[183751]: 2026-01-27 22:26:22.640 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:26:22 compute-1 nova_compute[183751]: 2026-01-27 22:26:22.640 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:26:22 compute-1 nova_compute[183751]: 2026-01-27 22:26:22.641 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] No VIF found with MAC fa:16:3e:50:20:5e, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:26:22 compute-1 nova_compute[183751]: 2026-01-27 22:26:22.641 183755 INFO nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Using config drive
Jan 27 22:26:23 compute-1 nova_compute[183751]: 2026-01-27 22:26:23.154 183755 WARNING neutronclient.v2_0.client [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:26:23 compute-1 nova_compute[183751]: 2026-01-27 22:26:23.526 183755 INFO nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Creating config drive at /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk.config
Jan 27 22:26:23 compute-1 nova_compute[183751]: 2026-01-27 22:26:23.536 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpc24u8tqx execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:26:23 compute-1 nova_compute[183751]: 2026-01-27 22:26:23.676 183755 DEBUG oslo_concurrency.processutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpc24u8tqx" returned: 0 in 0.141s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:26:23 compute-1 kernel: tap8978ebf8-3b: entered promiscuous mode
Jan 27 22:26:23 compute-1 NetworkManager[56069]: <info>  [1769552783.7674] manager: (tap8978ebf8-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Jan 27 22:26:23 compute-1 ovn_controller[95969]: 2026-01-27T22:26:23Z|00066|binding|INFO|Claiming lport 8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e for this chassis.
Jan 27 22:26:23 compute-1 ovn_controller[95969]: 2026-01-27T22:26:23Z|00067|binding|INFO|8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e: Claiming fa:16:3e:50:20:5e 10.100.0.6
Jan 27 22:26:23 compute-1 nova_compute[183751]: 2026-01-27 22:26:23.769 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.777 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:20:5e 10.100.0.6'], port_security=['fa:16:3e:50:20:5e 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4953f476-82d4-4a3e-ac76-b85ff3ad1d4f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da4007d0-b29b-4778-a286-b5dd1155cf44', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a4dc4388a0f4e9eb9abef43e0bc8df1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c5db407e-e957-436c-a734-22d98b76b6e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6f757ff-7e1b-47d8-a25a-9e1dab5d0324, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.778 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e in datapath da4007d0-b29b-4778-a286-b5dd1155cf44 bound to our chassis
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.780 105247 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network da4007d0-b29b-4778-a286-b5dd1155cf44
Jan 27 22:26:23 compute-1 ovn_controller[95969]: 2026-01-27T22:26:23Z|00068|binding|INFO|Setting lport 8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e ovn-installed in OVS
Jan 27 22:26:23 compute-1 ovn_controller[95969]: 2026-01-27T22:26:23Z|00069|binding|INFO|Setting lport 8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e up in Southbound
Jan 27 22:26:23 compute-1 nova_compute[183751]: 2026-01-27 22:26:23.787 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:23 compute-1 nova_compute[183751]: 2026-01-27 22:26:23.792 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.805 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[98d3382b-491a-4c91-9fa7-5a1085ac7bb2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.807 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapda4007d0-b1 in ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.809 212869 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapda4007d0-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.810 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[2a5ae96f-486c-4390-b773-7dea4fe93cdb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.810 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[60243c48-6cf5-44c9-a6f6-8abb13d502fe]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:23 compute-1 systemd-udevd[217171]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:26:23 compute-1 systemd-machined[155034]: New machine qemu-4-instance-00000008.
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.825 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[82e6c30e-261d-466a-98b6-3c227436700d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:23 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-00000008.
Jan 27 22:26:23 compute-1 NetworkManager[56069]: <info>  [1769552783.8352] device (tap8978ebf8-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 22:26:23 compute-1 NetworkManager[56069]: <info>  [1769552783.8361] device (tap8978ebf8-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.846 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd4fb8e-4d99-4ec6-b5c9-9df49486d152]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.893 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[6d195ef1-1514-4c73-b6a5-a80067c296fd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:23 compute-1 systemd-udevd[217175]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.899 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[85d0df36-da4f-48df-a439-94d0164f6959]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:23 compute-1 NetworkManager[56069]: <info>  [1769552783.9010] manager: (tapda4007d0-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.946 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[ab58c6ae-ce4c-4339-ad52-c8aa722382e1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.950 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb4cf7f-22e9-48ee-bd61-09d74b503494]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:23 compute-1 NetworkManager[56069]: <info>  [1769552783.9841] device (tapda4007d0-b0): carrier: link connected
Jan 27 22:26:23 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:23.991 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[9bc2f13e-b122-4171-a31a-350440ec41ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.012 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[8d524c5d-cefc-4a19-9437-174699e43d96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda4007d0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:a2:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 892625, 'reachable_time': 20556, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217203, 'error': None, 'target': 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.030 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[b084090f-4d2c-4f71-8673-d0545f400884]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:a269'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 892625, 'tstamp': 892625}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217204, 'error': None, 'target': 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.048 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[d9cdc31a-54bd-4130-930a-3027d3005888]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda4007d0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:a2:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 892625, 'reachable_time': 20556, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217205, 'error': None, 'target': 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.090 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[bc241ba8-548f-453f-919e-80958a48c4f5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.116 183755 DEBUG nova.compute.manager [req-81d867fe-8bca-4947-9585-66f7e0dd4a55 req-9d93e38f-5c5c-4da9-a6e7-2cce27ab1c36 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Received event network-vif-plugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.117 183755 DEBUG oslo_concurrency.lockutils [req-81d867fe-8bca-4947-9585-66f7e0dd4a55 req-9d93e38f-5c5c-4da9-a6e7-2cce27ab1c36 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.118 183755 DEBUG oslo_concurrency.lockutils [req-81d867fe-8bca-4947-9585-66f7e0dd4a55 req-9d93e38f-5c5c-4da9-a6e7-2cce27ab1c36 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.118 183755 DEBUG oslo_concurrency.lockutils [req-81d867fe-8bca-4947-9585-66f7e0dd4a55 req-9d93e38f-5c5c-4da9-a6e7-2cce27ab1c36 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.119 183755 DEBUG nova.compute.manager [req-81d867fe-8bca-4947-9585-66f7e0dd4a55 req-9d93e38f-5c5c-4da9-a6e7-2cce27ab1c36 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Processing event network-vif-plugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.190 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[8159c968-5e1f-4aad-be06-ff0985c20444]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.192 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda4007d0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.192 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.193 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda4007d0-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.195 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:24 compute-1 NetworkManager[56069]: <info>  [1769552784.1961] manager: (tapda4007d0-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 27 22:26:24 compute-1 kernel: tapda4007d0-b0: entered promiscuous mode
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.197 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.199 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapda4007d0-b0, col_values=(('external_ids', {'iface-id': 'b2c930e7-5341-4fb1-8625-e406ed44ab80'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.200 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:24 compute-1 ovn_controller[95969]: 2026-01-27T22:26:24Z|00070|binding|INFO|Releasing lport b2c930e7-5341-4fb1-8625-e406ed44ab80 from this chassis (sb_readonly=0)
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.228 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.229 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[dd314ed5-61d0-4a67-8100-69fefd6e4097]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.230 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.230 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.230 105247 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for da4007d0-b29b-4778-a286-b5dd1155cf44 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.231 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.231 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[11615326-800e-423d-8bcc-318b827ad2f3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.231 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.232 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[41f7ba62-492a-4bdb-8a3e-244605f5b2ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.232 105247 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: global
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     log         /dev/log local0 debug
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     log-tag     haproxy-metadata-proxy-da4007d0-b29b-4778-a286-b5dd1155cf44
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     user        root
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     group       root
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     maxconn     1024
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     pidfile     /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     daemon
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: defaults
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     log global
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     mode http
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     option httplog
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     option dontlognull
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     option http-server-close
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     option forwardfor
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     retries                 3
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     timeout http-request    30s
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     timeout connect         30s
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     timeout client          32s
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     timeout server          32s
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     timeout http-keep-alive 30s
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: listen listener
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     bind 169.254.169.254:80
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     server metadata /var/lib/neutron/metadata_proxy
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:     http-request add-header X-OVN-Network-ID da4007d0-b29b-4778-a286-b5dd1155cf44
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 27 22:26:24 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:24.233 105247 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'env', 'PROCESS_TAG=haproxy-da4007d0-b29b-4778-a286-b5dd1155cf44', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/da4007d0-b29b-4778-a286-b5dd1155cf44.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.514 183755 DEBUG nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.518 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.521 183755 INFO nova.virt.libvirt.driver [-] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Instance spawned successfully.
Jan 27 22:26:24 compute-1 nova_compute[183751]: 2026-01-27 22:26:24.522 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 27 22:26:24 compute-1 podman[217244]: 2026-01-27 22:26:24.723283637 +0000 UTC m=+0.080140204 container create 27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 27 22:26:24 compute-1 systemd[1]: Started libpod-conmon-27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252.scope.
Jan 27 22:26:24 compute-1 podman[217244]: 2026-01-27 22:26:24.682834856 +0000 UTC m=+0.039691483 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 22:26:24 compute-1 systemd[1]: Started libcrun container.
Jan 27 22:26:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31c40195879afd5dfc07a5f374cf79e035e6e14ef124eb0ffc47570c1aa6d035/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 22:26:24 compute-1 podman[217244]: 2026-01-27 22:26:24.823698263 +0000 UTC m=+0.180554830 container init 27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Jan 27 22:26:24 compute-1 podman[217244]: 2026-01-27 22:26:24.834349387 +0000 UTC m=+0.191205924 container start 27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 27 22:26:24 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[217259]: [NOTICE]   (217263) : New worker (217265) forked
Jan 27 22:26:24 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[217259]: [NOTICE]   (217263) : Loading success.
Jan 27 22:26:25 compute-1 nova_compute[183751]: 2026-01-27 22:26:25.035 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:26:25 compute-1 nova_compute[183751]: 2026-01-27 22:26:25.037 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:26:25 compute-1 nova_compute[183751]: 2026-01-27 22:26:25.037 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:26:25 compute-1 nova_compute[183751]: 2026-01-27 22:26:25.038 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:26:25 compute-1 nova_compute[183751]: 2026-01-27 22:26:25.038 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:26:25 compute-1 nova_compute[183751]: 2026-01-27 22:26:25.039 183755 DEBUG nova.virt.libvirt.driver [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:26:25 compute-1 nova_compute[183751]: 2026-01-27 22:26:25.552 183755 INFO nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Took 8.66 seconds to spawn the instance on the hypervisor.
Jan 27 22:26:25 compute-1 nova_compute[183751]: 2026-01-27 22:26:25.554 183755 DEBUG nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 27 22:26:26 compute-1 nova_compute[183751]: 2026-01-27 22:26:26.079 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:26 compute-1 nova_compute[183751]: 2026-01-27 22:26:26.149 183755 INFO nova.compute.manager [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Took 14.01 seconds to build instance.
Jan 27 22:26:26 compute-1 nova_compute[183751]: 2026-01-27 22:26:26.236 183755 DEBUG nova.compute.manager [req-c5b30d54-628f-46a9-8391-34ec726fc2fe req-6130f1ae-f051-4ebe-9f78-f7e85f31785c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Received event network-vif-plugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:26:26 compute-1 nova_compute[183751]: 2026-01-27 22:26:26.237 183755 DEBUG oslo_concurrency.lockutils [req-c5b30d54-628f-46a9-8391-34ec726fc2fe req-6130f1ae-f051-4ebe-9f78-f7e85f31785c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:26 compute-1 nova_compute[183751]: 2026-01-27 22:26:26.237 183755 DEBUG oslo_concurrency.lockutils [req-c5b30d54-628f-46a9-8391-34ec726fc2fe req-6130f1ae-f051-4ebe-9f78-f7e85f31785c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:26 compute-1 nova_compute[183751]: 2026-01-27 22:26:26.238 183755 DEBUG oslo_concurrency.lockutils [req-c5b30d54-628f-46a9-8391-34ec726fc2fe req-6130f1ae-f051-4ebe-9f78-f7e85f31785c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:26 compute-1 nova_compute[183751]: 2026-01-27 22:26:26.239 183755 DEBUG nova.compute.manager [req-c5b30d54-628f-46a9-8391-34ec726fc2fe req-6130f1ae-f051-4ebe-9f78-f7e85f31785c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] No waiting events found dispatching network-vif-plugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:26:26 compute-1 nova_compute[183751]: 2026-01-27 22:26:26.239 183755 WARNING nova.compute.manager [req-c5b30d54-628f-46a9-8391-34ec726fc2fe req-6130f1ae-f051-4ebe-9f78-f7e85f31785c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Received unexpected event network-vif-plugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e for instance with vm_state active and task_state None.
Jan 27 22:26:26 compute-1 nova_compute[183751]: 2026-01-27 22:26:26.657 183755 DEBUG oslo_concurrency.lockutils [None req-bb13c305-990b-477c-b61b-2250bdea76b2 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.536s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:28 compute-1 nova_compute[183751]: 2026-01-27 22:26:28.912 183755 DEBUG oslo_concurrency.lockutils [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:28 compute-1 nova_compute[183751]: 2026-01-27 22:26:28.913 183755 DEBUG oslo_concurrency.lockutils [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:28 compute-1 nova_compute[183751]: 2026-01-27 22:26:28.913 183755 DEBUG oslo_concurrency.lockutils [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:28 compute-1 nova_compute[183751]: 2026-01-27 22:26:28.914 183755 DEBUG oslo_concurrency.lockutils [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:28 compute-1 nova_compute[183751]: 2026-01-27 22:26:28.914 183755 DEBUG oslo_concurrency.lockutils [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:28 compute-1 nova_compute[183751]: 2026-01-27 22:26:28.925 183755 INFO nova.compute.manager [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Terminating instance
Jan 27 22:26:29 compute-1 nova_compute[183751]: 2026-01-27 22:26:29.440 183755 DEBUG nova.compute.manager [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 27 22:26:29 compute-1 kernel: tap8978ebf8-3b (unregistering): left promiscuous mode
Jan 27 22:26:29 compute-1 NetworkManager[56069]: <info>  [1769552789.4710] device (tap8978ebf8-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 27 22:26:29 compute-1 ovn_controller[95969]: 2026-01-27T22:26:29Z|00071|binding|INFO|Releasing lport 8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e from this chassis (sb_readonly=0)
Jan 27 22:26:29 compute-1 ovn_controller[95969]: 2026-01-27T22:26:29Z|00072|binding|INFO|Setting lport 8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e down in Southbound
Jan 27 22:26:29 compute-1 nova_compute[183751]: 2026-01-27 22:26:29.525 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:29 compute-1 ovn_controller[95969]: 2026-01-27T22:26:29Z|00073|binding|INFO|Removing iface tap8978ebf8-3b ovn-installed in OVS
Jan 27 22:26:29 compute-1 nova_compute[183751]: 2026-01-27 22:26:29.528 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.536 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:20:5e 10.100.0.6'], port_security=['fa:16:3e:50:20:5e 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4953f476-82d4-4a3e-ac76-b85ff3ad1d4f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da4007d0-b29b-4778-a286-b5dd1155cf44', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a4dc4388a0f4e9eb9abef43e0bc8df1', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c5db407e-e957-436c-a734-22d98b76b6e9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6f757ff-7e1b-47d8-a25a-9e1dab5d0324, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.537 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e in datapath da4007d0-b29b-4778-a286-b5dd1155cf44 unbound from our chassis
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.538 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network da4007d0-b29b-4778-a286-b5dd1155cf44, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.538 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[aed1e53d-d9fa-40b5-a04c-545cc7e64f12]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.538 105247 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44 namespace which is not needed anymore
Jan 27 22:26:29 compute-1 nova_compute[183751]: 2026-01-27 22:26:29.550 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:29 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 27 22:26:29 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Consumed 5.719s CPU time.
Jan 27 22:26:29 compute-1 systemd-machined[155034]: Machine qemu-4-instance-00000008 terminated.
Jan 27 22:26:29 compute-1 nova_compute[183751]: 2026-01-27 22:26:29.724 183755 INFO nova.virt.libvirt.driver [-] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Instance destroyed successfully.
Jan 27 22:26:29 compute-1 nova_compute[183751]: 2026-01-27 22:26:29.726 183755 DEBUG nova.objects.instance [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lazy-loading 'resources' on Instance uuid 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:26:29 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[217259]: [NOTICE]   (217263) : haproxy version is 3.0.5-8e879a5
Jan 27 22:26:29 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[217259]: [NOTICE]   (217263) : path to executable is /usr/sbin/haproxy
Jan 27 22:26:29 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[217259]: [WARNING]  (217263) : Exiting Master process...
Jan 27 22:26:29 compute-1 podman[217299]: 2026-01-27 22:26:29.729655308 +0000 UTC m=+0.053249319 container kill 27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 27 22:26:29 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[217259]: [ALERT]    (217263) : Current worker (217265) exited with code 143 (Terminated)
Jan 27 22:26:29 compute-1 neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44[217259]: [WARNING]  (217263) : All workers exited. Exiting... (0)
Jan 27 22:26:29 compute-1 systemd[1]: libpod-27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252.scope: Deactivated successfully.
Jan 27 22:26:29 compute-1 podman[217327]: 2026-01-27 22:26:29.787621152 +0000 UTC m=+0.036975137 container died 27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 27 22:26:29 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252-userdata-shm.mount: Deactivated successfully.
Jan 27 22:26:29 compute-1 systemd[1]: var-lib-containers-storage-overlay-31c40195879afd5dfc07a5f374cf79e035e6e14ef124eb0ffc47570c1aa6d035-merged.mount: Deactivated successfully.
Jan 27 22:26:29 compute-1 podman[217327]: 2026-01-27 22:26:29.842706936 +0000 UTC m=+0.092060830 container cleanup 27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260126, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 27 22:26:29 compute-1 systemd[1]: libpod-conmon-27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252.scope: Deactivated successfully.
Jan 27 22:26:29 compute-1 podman[217334]: 2026-01-27 22:26:29.865436379 +0000 UTC m=+0.091619629 container remove 27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.890 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c4319786-2de2-4615-a684-ef590533ea73]: (4, ("Tue Jan 27 10:26:29 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44 (27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252)\n27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252\nTue Jan 27 10:26:29 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44 (27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252)\n27cd5a699d3a3af2aab1ce05720c5e2f2b1e9c879136202e41d272a26ea4d252\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.892 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[e46056b7-e420-4001-a912-3b396aa7a6fb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.893 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da4007d0-b29b-4778-a286-b5dd1155cf44.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.894 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[19846a56-c41e-491d-b647-9d8972d46b9b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.895 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda4007d0-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:29 compute-1 nova_compute[183751]: 2026-01-27 22:26:29.897 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:29 compute-1 kernel: tapda4007d0-b0: left promiscuous mode
Jan 27 22:26:29 compute-1 nova_compute[183751]: 2026-01-27 22:26:29.928 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.930 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[499bad3f-b4b6-4a4d-ade3-1e20d6e88b05]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.950 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[9c304ba2-6455-4e05-8121-05c0602d13cf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.951 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[51b842d0-720b-4002-a94d-64803449e0cb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.974 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[f318c353-b4fd-42df-91eb-04a2850228cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 892614, 'reachable_time': 19622, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217362, 'error': None, 'target': 'ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.975 105687 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-da4007d0-b29b-4778-a286-b5dd1155cf44 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 27 22:26:29 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:26:29.975 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[053a2e36-b9bf-4ce7-9815-fd388c5d6fcb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:26:29 compute-1 systemd[1]: run-netns-ovnmeta\x2dda4007d0\x2db29b\x2d4778\x2da286\x2db5dd1155cf44.mount: Deactivated successfully.
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.236 183755 DEBUG nova.virt.libvirt.vif [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-27T22:26:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1330756688',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1330756688',id=8,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-27T22:26:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6a4dc4388a0f4e9eb9abef43e0bc8df1',ramdisk_id='',reservation_id='r-9hidts5a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1384268311',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1384268311-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-27T22:26:25Z,user_data=None,user_id='8763102ab7304e1d9f53b063264a3607',uuid=4953f476-82d4-4a3e-ac76-b85ff3ad1d4f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "address": "fa:16:3e:50:20:5e", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8978ebf8-3b", "ovs_interfaceid": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.237 183755 DEBUG nova.network.os_vif_util [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converting VIF {"id": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "address": "fa:16:3e:50:20:5e", "network": {"id": "da4007d0-b29b-4778-a286-b5dd1155cf44", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1394650910-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a965ed27f6ee4c02a2538e87cb3ecdeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8978ebf8-3b", "ovs_interfaceid": "8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.238 183755 DEBUG nova.network.os_vif_util [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:20:5e,bridge_name='br-int',has_traffic_filtering=True,id=8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8978ebf8-3b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.238 183755 DEBUG os_vif [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:20:5e,bridge_name='br-int',has_traffic_filtering=True,id=8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8978ebf8-3b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.241 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.241 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8978ebf8-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.243 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.245 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.246 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.247 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=93604307-1a42-4344-aa9e-0c3c41d9cad2) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.248 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.250 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.252 183755 INFO os_vif [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:20:5e,bridge_name='br-int',has_traffic_filtering=True,id=8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e,network=Network(da4007d0-b29b-4778-a286-b5dd1155cf44),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8978ebf8-3b')
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.253 183755 INFO nova.virt.libvirt.driver [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Deleting instance files /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f_del
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.255 183755 INFO nova.virt.libvirt.driver [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Deletion of /var/lib/nova/instances/4953f476-82d4-4a3e-ac76-b85ff3ad1d4f_del complete
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.423 183755 DEBUG nova.compute.manager [req-921b8946-9a19-44cb-ae0e-cce285d2e172 req-97b11666-c5b9-4079-b36e-ea20808d07c6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Received event network-vif-unplugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.423 183755 DEBUG oslo_concurrency.lockutils [req-921b8946-9a19-44cb-ae0e-cce285d2e172 req-97b11666-c5b9-4079-b36e-ea20808d07c6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.424 183755 DEBUG oslo_concurrency.lockutils [req-921b8946-9a19-44cb-ae0e-cce285d2e172 req-97b11666-c5b9-4079-b36e-ea20808d07c6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.424 183755 DEBUG oslo_concurrency.lockutils [req-921b8946-9a19-44cb-ae0e-cce285d2e172 req-97b11666-c5b9-4079-b36e-ea20808d07c6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.424 183755 DEBUG nova.compute.manager [req-921b8946-9a19-44cb-ae0e-cce285d2e172 req-97b11666-c5b9-4079-b36e-ea20808d07c6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] No waiting events found dispatching network-vif-unplugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.424 183755 DEBUG nova.compute.manager [req-921b8946-9a19-44cb-ae0e-cce285d2e172 req-97b11666-c5b9-4079-b36e-ea20808d07c6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Received event network-vif-unplugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.769 183755 INFO nova.compute.manager [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Took 1.33 seconds to destroy the instance on the hypervisor.
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.769 183755 DEBUG oslo.service.backend._eventlet.loopingcall [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.770 183755 DEBUG nova.compute.manager [-] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.770 183755 DEBUG nova.network.neutron [-] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 27 22:26:30 compute-1 nova_compute[183751]: 2026-01-27 22:26:30.770 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:26:31 compute-1 nova_compute[183751]: 2026-01-27 22:26:31.080 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:31 compute-1 nova_compute[183751]: 2026-01-27 22:26:31.500 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:26:32 compute-1 nova_compute[183751]: 2026-01-27 22:26:32.257 183755 DEBUG nova.network.neutron [-] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:26:32 compute-1 nova_compute[183751]: 2026-01-27 22:26:32.505 183755 DEBUG nova.compute.manager [req-2d122ebf-568d-4175-9419-1b19cebcf418 req-241d1a71-9ef6-434a-a3c4-1680677ae324 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Received event network-vif-unplugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:26:32 compute-1 nova_compute[183751]: 2026-01-27 22:26:32.506 183755 DEBUG oslo_concurrency.lockutils [req-2d122ebf-568d-4175-9419-1b19cebcf418 req-241d1a71-9ef6-434a-a3c4-1680677ae324 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:32 compute-1 nova_compute[183751]: 2026-01-27 22:26:32.506 183755 DEBUG oslo_concurrency.lockutils [req-2d122ebf-568d-4175-9419-1b19cebcf418 req-241d1a71-9ef6-434a-a3c4-1680677ae324 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:32 compute-1 nova_compute[183751]: 2026-01-27 22:26:32.506 183755 DEBUG oslo_concurrency.lockutils [req-2d122ebf-568d-4175-9419-1b19cebcf418 req-241d1a71-9ef6-434a-a3c4-1680677ae324 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:32 compute-1 nova_compute[183751]: 2026-01-27 22:26:32.506 183755 DEBUG nova.compute.manager [req-2d122ebf-568d-4175-9419-1b19cebcf418 req-241d1a71-9ef6-434a-a3c4-1680677ae324 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] No waiting events found dispatching network-vif-unplugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:26:32 compute-1 nova_compute[183751]: 2026-01-27 22:26:32.507 183755 DEBUG nova.compute.manager [req-2d122ebf-568d-4175-9419-1b19cebcf418 req-241d1a71-9ef6-434a-a3c4-1680677ae324 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Received event network-vif-unplugged-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:26:32 compute-1 nova_compute[183751]: 2026-01-27 22:26:32.507 183755 DEBUG nova.compute.manager [req-2d122ebf-568d-4175-9419-1b19cebcf418 req-241d1a71-9ef6-434a-a3c4-1680677ae324 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Received event network-vif-deleted-8978ebf8-3bcd-4f5c-821e-06d2db0d7e0e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:26:32 compute-1 nova_compute[183751]: 2026-01-27 22:26:32.763 183755 INFO nova.compute.manager [-] [instance: 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f] Took 1.99 seconds to deallocate network for instance.
Jan 27 22:26:33 compute-1 nova_compute[183751]: 2026-01-27 22:26:33.288 183755 DEBUG oslo_concurrency.lockutils [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:33 compute-1 nova_compute[183751]: 2026-01-27 22:26:33.289 183755 DEBUG oslo_concurrency.lockutils [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:33 compute-1 nova_compute[183751]: 2026-01-27 22:26:33.363 183755 DEBUG nova.compute.provider_tree [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:26:33 compute-1 nova_compute[183751]: 2026-01-27 22:26:33.871 183755 DEBUG nova.scheduler.client.report [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:26:34 compute-1 nova_compute[183751]: 2026-01-27 22:26:34.384 183755 DEBUG oslo_concurrency.lockutils [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:34 compute-1 nova_compute[183751]: 2026-01-27 22:26:34.569 183755 INFO nova.scheduler.client.report [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Deleted allocations for instance 4953f476-82d4-4a3e-ac76-b85ff3ad1d4f
Jan 27 22:26:35 compute-1 nova_compute[183751]: 2026-01-27 22:26:35.249 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:35 compute-1 nova_compute[183751]: 2026-01-27 22:26:35.602 183755 DEBUG oslo_concurrency.lockutils [None req-85bbf3e2-edb8-4eff-88ed-80ef42badb03 8763102ab7304e1d9f53b063264a3607 6a4dc4388a0f4e9eb9abef43e0bc8df1 - - default default] Lock "4953f476-82d4-4a3e-ac76-b85ff3ad1d4f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.689s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:35 compute-1 podman[193064]: time="2026-01-27T22:26:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:26:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:26:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:26:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:26:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 27 22:26:35 compute-1 podman[217364]: 2026-01-27 22:26:35.875537634 +0000 UTC m=+0.164021550 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 22:26:36 compute-1 nova_compute[183751]: 2026-01-27 22:26:36.082 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:40 compute-1 nova_compute[183751]: 2026-01-27 22:26:40.250 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:40 compute-1 podman[217392]: 2026-01-27 22:26:40.799490015 +0000 UTC m=+0.093888945 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Jan 27 22:26:40 compute-1 podman[217391]: 2026-01-27 22:26:40.802587082 +0000 UTC m=+0.107766929 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base 
Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 22:26:41 compute-1 nova_compute[183751]: 2026-01-27 22:26:41.085 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:45 compute-1 nova_compute[183751]: 2026-01-27 22:26:45.291 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:46 compute-1 nova_compute[183751]: 2026-01-27 22:26:46.087 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:48 compute-1 nova_compute[183751]: 2026-01-27 22:26:48.416 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:49 compute-1 openstack_network_exporter[195945]: ERROR   22:26:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:26:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:26:49 compute-1 openstack_network_exporter[195945]: ERROR   22:26:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:26:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:26:49 compute-1 podman[217430]: 2026-01-27 22:26:49.772492871 +0000 UTC m=+0.074723200 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:26:50 compute-1 nova_compute[183751]: 2026-01-27 22:26:50.299 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:51 compute-1 nova_compute[183751]: 2026-01-27 22:26:51.090 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:55 compute-1 nova_compute[183751]: 2026-01-27 22:26:55.302 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:56 compute-1 nova_compute[183751]: 2026-01-27 22:26:56.093 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:26:57 compute-1 nova_compute[183751]: 2026-01-27 22:26:57.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.696 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.697 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.697 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.697 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.919 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.921 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.962 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.963 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5789MB free_disk=73.14263916015625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.963 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:26:58 compute-1 nova_compute[183751]: 2026-01-27 22:26:58.964 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:27:00 compute-1 nova_compute[183751]: 2026-01-27 22:27:00.027 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:27:00 compute-1 nova_compute[183751]: 2026-01-27 22:27:00.027 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:26:58 up  2:29,  0 user,  load average: 0.56, 0.29, 0.14\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:27:00 compute-1 nova_compute[183751]: 2026-01-27 22:27:00.051 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:27:00 compute-1 nova_compute[183751]: 2026-01-27 22:27:00.305 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:00 compute-1 nova_compute[183751]: 2026-01-27 22:27:00.559 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:27:01 compute-1 nova_compute[183751]: 2026-01-27 22:27:01.095 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:01 compute-1 nova_compute[183751]: 2026-01-27 22:27:01.262 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:27:01 compute-1 nova_compute[183751]: 2026-01-27 22:27:01.262 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.298s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:27:01 compute-1 nova_compute[183751]: 2026-01-27 22:27:01.262 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:27:01 compute-1 nova_compute[183751]: 2026-01-27 22:27:01.263 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 22:27:01 compute-1 nova_compute[183751]: 2026-01-27 22:27:01.772 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 22:27:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:02.745 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4d:2b:21 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e10517e36e1d445eb9e5770571d01f35', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4db3b16f-6743-4c2d-a5f9-9b2b6d3313bd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6e012cf2-42c8-4ee7-a370-6ad1e6e56131) old=Port_Binding(mac=['fa:16:3e:4d:2b:21'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e10517e36e1d445eb9e5770571d01f35', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:27:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:02.746 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6e012cf2-42c8-4ee7-a370-6ad1e6e56131 in datapath d5f6cbcb-0015-404b-a15e-d163be3d6b1a updated
Jan 27 22:27:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:02.747 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5f6cbcb-0015-404b-a15e-d163be3d6b1a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:27:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:02.750 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[44433170-cbfa-465b-baae-cc52a8d857b7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:27:03 compute-1 nova_compute[183751]: 2026-01-27 22:27:03.767 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:27:03 compute-1 nova_compute[183751]: 2026-01-27 22:27:03.768 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:27:03 compute-1 nova_compute[183751]: 2026-01-27 22:27:03.768 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:27:03 compute-1 nova_compute[183751]: 2026-01-27 22:27:03.768 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:27:03 compute-1 nova_compute[183751]: 2026-01-27 22:27:03.768 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:27:05 compute-1 nova_compute[183751]: 2026-01-27 22:27:05.308 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:05 compute-1 podman[193064]: time="2026-01-27T22:27:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:27:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:27:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:27:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:27:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 27 22:27:06 compute-1 nova_compute[183751]: 2026-01-27 22:27:06.128 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:06 compute-1 podman[217458]: 2026-01-27 22:27:06.833054193 +0000 UTC m=+0.134521851 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
tcib_build_tag=watcher_latest, config_id=ovn_controller)
Jan 27 22:27:07 compute-1 nova_compute[183751]: 2026-01-27 22:27:07.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:27:07 compute-1 nova_compute[183751]: 2026-01-27 22:27:07.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 22:27:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:09.237 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:27:09 compute-1 nova_compute[183751]: 2026-01-27 22:27:09.237 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:09.239 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:27:10 compute-1 nova_compute[183751]: 2026-01-27 22:27:10.310 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:11 compute-1 nova_compute[183751]: 2026-01-27 22:27:11.130 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:11.272 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:27:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:11.273 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:27:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:11.273 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:27:11 compute-1 podman[217486]: 2026-01-27 22:27:11.779571923 +0000 UTC m=+0.082147905 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, release=1755695350)
Jan 27 22:27:11 compute-1 podman[217487]: 2026-01-27 22:27:11.788847812 +0000 UTC m=+0.088267366 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 22:27:15 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:15.241 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:27:15 compute-1 nova_compute[183751]: 2026-01-27 22:27:15.312 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:16 compute-1 nova_compute[183751]: 2026-01-27 22:27:16.132 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:18 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:18.073 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:6c:f8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-561d4e71-a4ff-42dd-801c-03bd6311f8a9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-561d4e71-a4ff-42dd-801c-03bd6311f8a9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a3669bc840040159a7655f1b219810c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c1133aa-f532-44cb-b980-f2039ce4f59a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4fdfa393-7290-422a-974d-adee420ce2f9) old=Port_Binding(mac=['fa:16:3e:a1:6c:f8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-561d4e71-a4ff-42dd-801c-03bd6311f8a9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-561d4e71-a4ff-42dd-801c-03bd6311f8a9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a3669bc840040159a7655f1b219810c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:27:18 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:18.075 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4fdfa393-7290-422a-974d-adee420ce2f9 in datapath 561d4e71-a4ff-42dd-801c-03bd6311f8a9 updated
Jan 27 22:27:18 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:18.076 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 561d4e71-a4ff-42dd-801c-03bd6311f8a9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:27:18 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:18.076 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[daf83043-1b0d-48fe-8e7b-58dcb53bf70f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:27:19 compute-1 openstack_network_exporter[195945]: ERROR   22:27:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:27:19 compute-1 openstack_network_exporter[195945]: ERROR   22:27:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:27:20 compute-1 nova_compute[183751]: 2026-01-27 22:27:20.314 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:20 compute-1 podman[217528]: 2026-01-27 22:27:20.801804698 +0000 UTC m=+0.110136327 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:27:21 compute-1 nova_compute[183751]: 2026-01-27 22:27:21.133 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:22 compute-1 ovn_controller[95969]: 2026-01-27T22:27:22Z|00074|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 27 22:27:25 compute-1 nova_compute[183751]: 2026-01-27 22:27:25.316 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:26 compute-1 nova_compute[183751]: 2026-01-27 22:27:26.136 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:29 compute-1 nova_compute[183751]: 2026-01-27 22:27:29.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:27:30 compute-1 nova_compute[183751]: 2026-01-27 22:27:30.319 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:31 compute-1 nova_compute[183751]: 2026-01-27 22:27:31.138 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:35 compute-1 nova_compute[183751]: 2026-01-27 22:27:35.321 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:35 compute-1 podman[193064]: time="2026-01-27T22:27:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:27:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:27:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:27:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:27:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 27 22:27:36 compute-1 nova_compute[183751]: 2026-01-27 22:27:36.150 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:37 compute-1 podman[217553]: 2026-01-27 22:27:37.832097222 +0000 UTC m=+0.135287879 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 27 22:27:40 compute-1 nova_compute[183751]: 2026-01-27 22:27:40.323 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:41 compute-1 nova_compute[183751]: 2026-01-27 22:27:41.154 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:42 compute-1 podman[217580]: 2026-01-27 22:27:42.784121719 +0000 UTC m=+0.082404551 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 27 22:27:42 compute-1 podman[217579]: 2026-01-27 22:27:42.809768554 +0000 UTC m=+0.117354886 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Jan 27 22:27:45 compute-1 nova_compute[183751]: 2026-01-27 22:27:45.324 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:46 compute-1 nova_compute[183751]: 2026-01-27 22:27:46.157 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:49 compute-1 openstack_network_exporter[195945]: ERROR   22:27:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:27:49 compute-1 openstack_network_exporter[195945]: ERROR   22:27:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:27:50 compute-1 nova_compute[183751]: 2026-01-27 22:27:50.326 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:51 compute-1 nova_compute[183751]: 2026-01-27 22:27:51.190 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:51 compute-1 podman[217617]: 2026-01-27 22:27:51.773176441 +0000 UTC m=+0.076889074 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:27:52 compute-1 nova_compute[183751]: 2026-01-27 22:27:52.351 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "ff165b27-6b1d-4da4-83de-b6f3a7913776" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:27:52 compute-1 nova_compute[183751]: 2026-01-27 22:27:52.351 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:27:52 compute-1 nova_compute[183751]: 2026-01-27 22:27:52.858 183755 DEBUG nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 27 22:27:53 compute-1 nova_compute[183751]: 2026-01-27 22:27:53.541 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:27:53 compute-1 nova_compute[183751]: 2026-01-27 22:27:53.542 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:27:53 compute-1 nova_compute[183751]: 2026-01-27 22:27:53.550 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 27 22:27:53 compute-1 nova_compute[183751]: 2026-01-27 22:27:53.550 183755 INFO nova.compute.claims [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Claim successful on node compute-1.ctlplane.example.com
Jan 27 22:27:54 compute-1 nova_compute[183751]: 2026-01-27 22:27:54.755 183755 DEBUG nova.compute.provider_tree [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:27:55 compute-1 nova_compute[183751]: 2026-01-27 22:27:55.265 183755 DEBUG nova.scheduler.client.report [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:27:55 compute-1 nova_compute[183751]: 2026-01-27 22:27:55.329 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:55 compute-1 nova_compute[183751]: 2026-01-27 22:27:55.783 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.241s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:27:55 compute-1 nova_compute[183751]: 2026-01-27 22:27:55.784 183755 DEBUG nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 27 22:27:56 compute-1 nova_compute[183751]: 2026-01-27 22:27:56.194 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:56 compute-1 nova_compute[183751]: 2026-01-27 22:27:56.297 183755 DEBUG nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 27 22:27:56 compute-1 nova_compute[183751]: 2026-01-27 22:27:56.298 183755 DEBUG nova.network.neutron [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 27 22:27:56 compute-1 nova_compute[183751]: 2026-01-27 22:27:56.299 183755 WARNING neutronclient.v2_0.client [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:27:56 compute-1 nova_compute[183751]: 2026-01-27 22:27:56.299 183755 WARNING neutronclient.v2_0.client [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:27:56 compute-1 nova_compute[183751]: 2026-01-27 22:27:56.806 183755 INFO nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 27 22:27:57 compute-1 nova_compute[183751]: 2026-01-27 22:27:57.321 183755 DEBUG nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 27 22:27:57 compute-1 nova_compute[183751]: 2026-01-27 22:27:57.654 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.344 183755 DEBUG nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.346 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.347 183755 INFO nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Creating image(s)
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.348 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "/var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.348 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "/var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.350 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "/var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.351 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.358 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.361 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.456 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.457 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.458 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.459 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.465 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.465 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.523 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.524 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.565 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.566 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.567 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:27:58 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:58.592 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.592 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:27:58 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:27:58.594 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.648 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.649 183755 DEBUG nova.virt.disk.api [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Checking if we can resize image /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.649 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.668 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.722 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.723 183755 DEBUG nova.virt.disk.api [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Cannot resize image /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.723 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.724 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Ensure instance console log exists: /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.724 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.724 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.725 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.745 183755 DEBUG nova.network.neutron [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Successfully created port: 7aad95c9-8de3-4ca5-a809-4c78b89d323a _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.901 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.905 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.935 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.936 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5826MB free_disk=73.14239883422852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.937 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:27:58 compute-1 nova_compute[183751]: 2026-01-27 22:27:58.937 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:27:59 compute-1 nova_compute[183751]: 2026-01-27 22:27:59.355 183755 DEBUG nova.network.neutron [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Successfully updated port: 7aad95c9-8de3-4ca5-a809-4c78b89d323a _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 27 22:27:59 compute-1 nova_compute[183751]: 2026-01-27 22:27:59.465 183755 DEBUG nova.compute.manager [req-5fcd0bf5-8a37-40f3-af95-206a50b37ce7 req-2a2ef850-0469-4cd3-9be8-9c8d5b0e1444 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Received event network-changed-7aad95c9-8de3-4ca5-a809-4c78b89d323a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:27:59 compute-1 nova_compute[183751]: 2026-01-27 22:27:59.465 183755 DEBUG nova.compute.manager [req-5fcd0bf5-8a37-40f3-af95-206a50b37ce7 req-2a2ef850-0469-4cd3-9be8-9c8d5b0e1444 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Refreshing instance network info cache due to event network-changed-7aad95c9-8de3-4ca5-a809-4c78b89d323a. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 27 22:27:59 compute-1 nova_compute[183751]: 2026-01-27 22:27:59.466 183755 DEBUG oslo_concurrency.lockutils [req-5fcd0bf5-8a37-40f3-af95-206a50b37ce7 req-2a2ef850-0469-4cd3-9be8-9c8d5b0e1444 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "refresh_cache-ff165b27-6b1d-4da4-83de-b6f3a7913776" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:27:59 compute-1 nova_compute[183751]: 2026-01-27 22:27:59.466 183755 DEBUG oslo_concurrency.lockutils [req-5fcd0bf5-8a37-40f3-af95-206a50b37ce7 req-2a2ef850-0469-4cd3-9be8-9c8d5b0e1444 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquired lock "refresh_cache-ff165b27-6b1d-4da4-83de-b6f3a7913776" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:27:59 compute-1 nova_compute[183751]: 2026-01-27 22:27:59.467 183755 DEBUG nova.network.neutron [req-5fcd0bf5-8a37-40f3-af95-206a50b37ce7 req-2a2ef850-0469-4cd3-9be8-9c8d5b0e1444 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Refreshing network info cache for port 7aad95c9-8de3-4ca5-a809-4c78b89d323a _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 27 22:27:59 compute-1 nova_compute[183751]: 2026-01-27 22:27:59.862 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "refresh_cache-ff165b27-6b1d-4da4-83de-b6f3a7913776" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.008 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Instance ff165b27-6b1d-4da4-83de-b6f3a7913776 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.009 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.010 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:27:58 up  2:30,  0 user,  load average: 0.20, 0.23, 0.13\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_1a3669bc840040159a7655f1b219810c': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.028 183755 WARNING neutronclient.v2_0.client [req-5fcd0bf5-8a37-40f3-af95-206a50b37ce7 req-2a2ef850-0469-4cd3-9be8-9c8d5b0e1444 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.088 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.157 183755 DEBUG nova.network.neutron [req-5fcd0bf5-8a37-40f3-af95-206a50b37ce7 req-2a2ef850-0469-4cd3-9be8-9c8d5b0e1444 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.331 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.387 183755 DEBUG nova.network.neutron [req-5fcd0bf5-8a37-40f3-af95-206a50b37ce7 req-2a2ef850-0469-4cd3-9be8-9c8d5b0e1444 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.598 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.896 183755 DEBUG oslo_concurrency.lockutils [req-5fcd0bf5-8a37-40f3-af95-206a50b37ce7 req-2a2ef850-0469-4cd3-9be8-9c8d5b0e1444 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Releasing lock "refresh_cache-ff165b27-6b1d-4da4-83de-b6f3a7913776" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.897 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquired lock "refresh_cache-ff165b27-6b1d-4da4-83de-b6f3a7913776" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:28:00 compute-1 nova_compute[183751]: 2026-01-27 22:28:00.897 183755 DEBUG nova.network.neutron [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 27 22:28:01 compute-1 nova_compute[183751]: 2026-01-27 22:28:01.108 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:28:01 compute-1 nova_compute[183751]: 2026-01-27 22:28:01.108 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.171s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:01 compute-1 nova_compute[183751]: 2026-01-27 22:28:01.236 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:02 compute-1 nova_compute[183751]: 2026-01-27 22:28:02.071 183755 DEBUG nova.network.neutron [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:28:02 compute-1 nova_compute[183751]: 2026-01-27 22:28:02.109 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:28:02 compute-1 nova_compute[183751]: 2026-01-27 22:28:02.110 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:28:02 compute-1 nova_compute[183751]: 2026-01-27 22:28:02.110 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:28:02 compute-1 nova_compute[183751]: 2026-01-27 22:28:02.111 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:28:02 compute-1 nova_compute[183751]: 2026-01-27 22:28:02.424 183755 WARNING neutronclient.v2_0.client [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.052 183755 DEBUG nova.network.neutron [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Updating instance_info_cache with network_info: [{"id": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "address": "fa:16:3e:af:eb:30", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad95c9-8d", "ovs_interfaceid": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.562 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Releasing lock "refresh_cache-ff165b27-6b1d-4da4-83de-b6f3a7913776" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.562 183755 DEBUG nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Instance network_info: |[{"id": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "address": "fa:16:3e:af:eb:30", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad95c9-8d", "ovs_interfaceid": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.567 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Start _get_guest_xml network_info=[{"id": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "address": "fa:16:3e:af:eb:30", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad95c9-8d", "ovs_interfaceid": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '46eb297a-0b7d-41f9-8336-a7ae35b5797e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.574 183755 WARNING nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.576 183755 DEBUG nova.virt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1588832548', uuid='ff165b27-6b1d-4da4-83de-b6f3a7913776'), owner=OwnerMeta(userid='4b9b219e067b4a669e9564e586cb41cd', username='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885-project-admin', projectid='1a3669bc840040159a7655f1b219810c', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885'), image=ImageMeta(id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "address": "fa:16:3e:af:eb:30", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap7aad95c9-8d", "ovs_interfaceid": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769552883.576736) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.581 183755 DEBUG nova.virt.libvirt.host [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.582 183755 DEBUG nova.virt.libvirt.host [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.585 183755 DEBUG nova.virt.libvirt.host [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.586 183755 DEBUG nova.virt.libvirt.host [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.587 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.588 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-27T22:07:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.588 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.589 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.589 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.589 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.590 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.590 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.591 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.591 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.591 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.592 183755 DEBUG nova.virt.hardware [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.597 183755 DEBUG nova.virt.libvirt.vif [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:27:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1588832548',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-158',id=10,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a3669bc840040159a7655f1b219810c',ramdisk_id='',reservation_id='r-oht7c44j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885',own
er_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:27:57Z,user_data=None,user_id='4b9b219e067b4a669e9564e586cb41cd',uuid=ff165b27-6b1d-4da4-83de-b6f3a7913776,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "address": "fa:16:3e:af:eb:30", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad95c9-8d", "ovs_interfaceid": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.597 183755 DEBUG nova.network.os_vif_util [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converting VIF {"id": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "address": "fa:16:3e:af:eb:30", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad95c9-8d", "ovs_interfaceid": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.598 183755 DEBUG nova.network.os_vif_util [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:eb:30,bridge_name='br-int',has_traffic_filtering=True,id=7aad95c9-8de3-4ca5-a809-4c78b89d323a,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad95c9-8d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:28:03 compute-1 nova_compute[183751]: 2026-01-27 22:28:03.600 183755 DEBUG nova.objects.instance [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lazy-loading 'pci_devices' on Instance uuid ff165b27-6b1d-4da4-83de-b6f3a7913776 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.110 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] End _get_guest_xml xml=<domain type="kvm">
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <uuid>ff165b27-6b1d-4da4-83de-b6f3a7913776</uuid>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <name>instance-0000000a</name>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <memory>131072</memory>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <vcpu>1</vcpu>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <metadata>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1588832548</nova:name>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <nova:creationTime>2026-01-27 22:28:03</nova:creationTime>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <nova:flavor name="m1.nano" id="4ed8ebce-d846-4d13-950a-4fb8c3a02942">
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:memory>128</nova:memory>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:disk>1</nova:disk>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:swap>0</nova:swap>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:ephemeral>0</nova:ephemeral>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:vcpus>1</nova:vcpus>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:extraSpecs>
Jan 27 22:28:04 compute-1 nova_compute[183751]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         </nova:extraSpecs>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       </nova:flavor>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <nova:image uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e">
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:minDisk>1</nova:minDisk>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:minRam>0</nova:minRam>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:properties>
Jan 27 22:28:04 compute-1 nova_compute[183751]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         </nova:properties>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       </nova:image>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <nova:owner>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:user uuid="4b9b219e067b4a669e9564e586cb41cd">tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885-project-admin</nova:user>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:project uuid="1a3669bc840040159a7655f1b219810c">tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885</nova:project>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       </nova:owner>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <nova:root type="image" uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <nova:ports>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         <nova:port uuid="7aad95c9-8de3-4ca5-a809-4c78b89d323a">
Jan 27 22:28:04 compute-1 nova_compute[183751]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:         </nova:port>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       </nova:ports>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     </nova:instance>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   </metadata>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <sysinfo type="smbios">
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <system>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <entry name="manufacturer">RDO</entry>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <entry name="product">OpenStack Compute</entry>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <entry name="serial">ff165b27-6b1d-4da4-83de-b6f3a7913776</entry>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <entry name="uuid">ff165b27-6b1d-4da4-83de-b6f3a7913776</entry>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <entry name="family">Virtual Machine</entry>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     </system>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   </sysinfo>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <os>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <boot dev="hd"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <smbios mode="sysinfo"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   </os>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <features>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <acpi/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <apic/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <vmcoreinfo/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   </features>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <clock offset="utc">
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <timer name="pit" tickpolicy="delay"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <timer name="hpet" present="no"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   </clock>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <cpu mode="custom" match="exact">
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <model>Nehalem</model>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <topology sockets="1" cores="1" threads="1"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   </cpu>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   <devices>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <disk type="file" device="disk">
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <target dev="vda" bus="virtio"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <disk type="file" device="cdrom">
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <driver name="qemu" type="raw" cache="none"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk.config"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <target dev="sda" bus="sata"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <interface type="ethernet">
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <mac address="fa:16:3e:af:eb:30"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <driver name="vhost" rx_queue_size="512"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <mtu size="1442"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <target dev="tap7aad95c9-8d"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     </interface>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <serial type="pty">
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <log file="/var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/console.log" append="off"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     </serial>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <video>
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     </video>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <input type="tablet" bus="usb"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <rng model="virtio">
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <backend model="random">/dev/urandom</backend>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     </rng>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <controller type="usb" index="0"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 27 22:28:04 compute-1 nova_compute[183751]:       <stats period="10"/>
Jan 27 22:28:04 compute-1 nova_compute[183751]:     </memballoon>
Jan 27 22:28:04 compute-1 nova_compute[183751]:   </devices>
Jan 27 22:28:04 compute-1 nova_compute[183751]: </domain>
Jan 27 22:28:04 compute-1 nova_compute[183751]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.112 183755 DEBUG nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Preparing to wait for external event network-vif-plugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.113 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.113 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.113 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.114 183755 DEBUG nova.virt.libvirt.vif [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:27:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1588832548',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-158',id=10,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a3669bc840040159a7655f1b219810c',ramdisk_id='',reservation_id='r-oht7c44j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-19283
83885',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:27:57Z,user_data=None,user_id='4b9b219e067b4a669e9564e586cb41cd',uuid=ff165b27-6b1d-4da4-83de-b6f3a7913776,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "address": "fa:16:3e:af:eb:30", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad95c9-8d", "ovs_interfaceid": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.114 183755 DEBUG nova.network.os_vif_util [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converting VIF {"id": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "address": "fa:16:3e:af:eb:30", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad95c9-8d", "ovs_interfaceid": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.115 183755 DEBUG nova.network.os_vif_util [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:eb:30,bridge_name='br-int',has_traffic_filtering=True,id=7aad95c9-8de3-4ca5-a809-4c78b89d323a,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad95c9-8d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.115 183755 DEBUG os_vif [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:eb:30,bridge_name='br-int',has_traffic_filtering=True,id=7aad95c9-8de3-4ca5-a809-4c78b89d323a,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad95c9-8d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.116 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.116 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.116 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.117 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.117 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7ddf5590-611a-5988-b5ba-20f547ae5802', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.119 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.119 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.121 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.126 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.126 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7aad95c9-8d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.127 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7aad95c9-8d, col_values=(('qos', UUID('c9222467-586c-4af4-94d0-b5d569aec7aa')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.128 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7aad95c9-8d, col_values=(('external_ids', {'iface-id': '7aad95c9-8de3-4ca5-a809-4c78b89d323a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:eb:30', 'vm-uuid': 'ff165b27-6b1d-4da4-83de-b6f3a7913776'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.130 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:04 compute-1 NetworkManager[56069]: <info>  [1769552884.1310] manager: (tap7aad95c9-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.133 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.138 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:04 compute-1 nova_compute[183751]: 2026-01-27 22:28:04.140 183755 INFO os_vif [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:eb:30,bridge_name='br-int',has_traffic_filtering=True,id=7aad95c9-8de3-4ca5-a809-4c78b89d323a,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad95c9-8d')
Jan 27 22:28:05 compute-1 podman[193064]: time="2026-01-27T22:28:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:28:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:28:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:28:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:28:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Jan 27 22:28:05 compute-1 nova_compute[183751]: 2026-01-27 22:28:05.688 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:28:05 compute-1 nova_compute[183751]: 2026-01-27 22:28:05.689 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:28:05 compute-1 nova_compute[183751]: 2026-01-27 22:28:05.690 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] No VIF found with MAC fa:16:3e:af:eb:30, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:28:05 compute-1 nova_compute[183751]: 2026-01-27 22:28:05.690 183755 INFO nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Using config drive
Jan 27 22:28:06 compute-1 nova_compute[183751]: 2026-01-27 22:28:06.208 183755 WARNING neutronclient.v2_0.client [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:28:06 compute-1 nova_compute[183751]: 2026-01-27 22:28:06.236 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:07 compute-1 nova_compute[183751]: 2026-01-27 22:28:07.648 183755 INFO nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Creating config drive at /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk.config
Jan 27 22:28:07 compute-1 nova_compute[183751]: 2026-01-27 22:28:07.657 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp2u0nk17e execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:28:07 compute-1 nova_compute[183751]: 2026-01-27 22:28:07.790 183755 DEBUG oslo_concurrency.processutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp2u0nk17e" returned: 0 in 0.133s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:28:07 compute-1 kernel: tap7aad95c9-8d: entered promiscuous mode
Jan 27 22:28:07 compute-1 ovn_controller[95969]: 2026-01-27T22:28:07Z|00075|binding|INFO|Claiming lport 7aad95c9-8de3-4ca5-a809-4c78b89d323a for this chassis.
Jan 27 22:28:07 compute-1 ovn_controller[95969]: 2026-01-27T22:28:07Z|00076|binding|INFO|7aad95c9-8de3-4ca5-a809-4c78b89d323a: Claiming fa:16:3e:af:eb:30 10.100.0.14
Jan 27 22:28:07 compute-1 NetworkManager[56069]: <info>  [1769552887.8936] manager: (tap7aad95c9-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Jan 27 22:28:07 compute-1 nova_compute[183751]: 2026-01-27 22:28:07.893 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:07 compute-1 nova_compute[183751]: 2026-01-27 22:28:07.897 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:07 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.910 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:eb:30 10.100.0.14'], port_security=['fa:16:3e:af:eb:30 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ff165b27-6b1d-4da4-83de-b6f3a7913776', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a3669bc840040159a7655f1b219810c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f3a2e630-65c9-42f0-93ec-c493ad0a7683', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4db3b16f-6743-4c2d-a5f9-9b2b6d3313bd, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=7aad95c9-8de3-4ca5-a809-4c78b89d323a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:28:07 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.912 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 7aad95c9-8de3-4ca5-a809-4c78b89d323a in datapath d5f6cbcb-0015-404b-a15e-d163be3d6b1a bound to our chassis
Jan 27 22:28:07 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.913 105247 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d5f6cbcb-0015-404b-a15e-d163be3d6b1a
Jan 27 22:28:07 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.927 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[b520a6aa-c3ad-41f5-85b9-09fe70599eb8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:07 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.928 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd5f6cbcb-01 in ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 27 22:28:07 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.931 212869 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd5f6cbcb-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 27 22:28:07 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.931 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[3dbcaf04-2a53-4832-876c-dc7f9d5cc0b2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:07 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.931 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[12d42446-e4db-47a7-ac8e-1ae49e6cabe7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:07 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.945 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[bc1ef7b7-3d23-4181-8f3d-b1dcb9320669]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:07 compute-1 systemd-udevd[217695]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:28:07 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.967 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[1736b03c-6c9b-41b0-8ecc-8b9455ed74ca]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:07 compute-1 nova_compute[183751]: 2026-01-27 22:28:07.968 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:07 compute-1 ovn_controller[95969]: 2026-01-27T22:28:07Z|00077|binding|INFO|Setting lport 7aad95c9-8de3-4ca5-a809-4c78b89d323a ovn-installed in OVS
Jan 27 22:28:07 compute-1 ovn_controller[95969]: 2026-01-27T22:28:07Z|00078|binding|INFO|Setting lport 7aad95c9-8de3-4ca5-a809-4c78b89d323a up in Southbound
Jan 27 22:28:07 compute-1 nova_compute[183751]: 2026-01-27 22:28:07.975 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:07 compute-1 NetworkManager[56069]: <info>  [1769552887.9837] device (tap7aad95c9-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 22:28:07 compute-1 NetworkManager[56069]: <info>  [1769552887.9846] device (tap7aad95c9-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 27 22:28:07 compute-1 systemd-machined[155034]: New machine qemu-5-instance-0000000a.
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:07.999 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf69a9e-9a11-4a9f-b5b2-d3dbf1bf5830]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-0000000a.
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.005 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1a0fd3-cb1b-4145-a227-37fc62b4204d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 systemd-udevd[217703]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:28:08 compute-1 NetworkManager[56069]: <info>  [1769552888.0078] manager: (tapd5f6cbcb-00): new Veth device (/org/freedesktop/NetworkManager/Devices/38)
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.042 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[f42033d0-35c2-4f5b-93b5-e608a38c3f15]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.045 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[8d251fe1-1fab-4efa-8478-b3931154790d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 podman[217670]: 2026-01-27 22:28:08.060921626 +0000 UTC m=+0.173986117 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 27 22:28:08 compute-1 NetworkManager[56069]: <info>  [1769552888.0743] device (tapd5f6cbcb-00): carrier: link connected
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.081 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[4b097909-2628-47e9-af8f-8b6f04bd330b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.098 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[24bcc7a3-0259-4bba-977f-da830b303fd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5f6cbcb-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:2b:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 903034, 'reachable_time': 19467, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217736, 'error': None, 'target': 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.113 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[84b1208f-e21d-4b8e-94ba-8b64aa6bb26c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4d:2b21'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 903034, 'tstamp': 903034}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217738, 'error': None, 'target': 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.130 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[ee448e78-f766-4223-8b9c-61bc60f3766a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5f6cbcb-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:2b:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 903034, 'reachable_time': 19467, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217739, 'error': None, 'target': 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.161 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[50b2db63-f33d-4bd7-99dd-94521bae2c2c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.204 183755 DEBUG nova.compute.manager [req-d7f9e38a-8be0-4ab6-851e-da0f1e891ffe req-042ebf57-6373-4ad1-a193-96418aa02b59 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Received event network-vif-plugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.204 183755 DEBUG oslo_concurrency.lockutils [req-d7f9e38a-8be0-4ab6-851e-da0f1e891ffe req-042ebf57-6373-4ad1-a193-96418aa02b59 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.204 183755 DEBUG oslo_concurrency.lockutils [req-d7f9e38a-8be0-4ab6-851e-da0f1e891ffe req-042ebf57-6373-4ad1-a193-96418aa02b59 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.204 183755 DEBUG oslo_concurrency.lockutils [req-d7f9e38a-8be0-4ab6-851e-da0f1e891ffe req-042ebf57-6373-4ad1-a193-96418aa02b59 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.205 183755 DEBUG nova.compute.manager [req-d7f9e38a-8be0-4ab6-851e-da0f1e891ffe req-042ebf57-6373-4ad1-a193-96418aa02b59 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Processing event network-vif-plugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.224 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac0849e-f651-4150-8d7a-bb0d818c5eb0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.226 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f6cbcb-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.226 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.227 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5f6cbcb-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.228 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:08 compute-1 kernel: tapd5f6cbcb-00: entered promiscuous mode
Jan 27 22:28:08 compute-1 NetworkManager[56069]: <info>  [1769552888.2307] manager: (tapd5f6cbcb-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.231 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.232 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd5f6cbcb-00, col_values=(('external_ids', {'iface-id': '6e012cf2-42c8-4ee7-a370-6ad1e6e56131'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.234 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:08 compute-1 ovn_controller[95969]: 2026-01-27T22:28:08Z|00079|binding|INFO|Releasing lport 6e012cf2-42c8-4ee7-a370-6ad1e6e56131 from this chassis (sb_readonly=0)
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.234 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.236 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[78b2708e-ceb2-4730-86ad-0d2de03964ea]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.236 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.236 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.236 105247 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d5f6cbcb-0015-404b-a15e-d163be3d6b1a disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.236 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.237 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[f1f010ee-d591-45e3-a25f-0ed58340e5ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.237 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.237 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[8613afeb-fd3f-461a-968e-2764d138f3d4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.238 105247 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: global
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     log         /dev/log local0 debug
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     log-tag     haproxy-metadata-proxy-d5f6cbcb-0015-404b-a15e-d163be3d6b1a
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     user        root
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     group       root
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     maxconn     1024
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     pidfile     /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     daemon
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: defaults
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     log global
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     mode http
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     option httplog
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     option dontlognull
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     option http-server-close
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     option forwardfor
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     retries                 3
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     timeout http-request    30s
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     timeout connect         30s
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     timeout client          32s
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     timeout server          32s
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     timeout http-keep-alive 30s
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: listen listener
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     bind 169.254.169.254:80
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     server metadata /var/lib/neutron/metadata_proxy
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:     http-request add-header X-OVN-Network-ID d5f6cbcb-0015-404b-a15e-d163be3d6b1a
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.238 105247 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'env', 'PROCESS_TAG=haproxy-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.245 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.494 183755 DEBUG nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.500 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.505 183755 INFO nova.virt.libvirt.driver [-] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Instance spawned successfully.
Jan 27 22:28:08 compute-1 nova_compute[183751]: 2026-01-27 22:28:08.505 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 27 22:28:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:08.596 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:08 compute-1 podman[217778]: 2026-01-27 22:28:08.673025038 +0000 UTC m=+0.077779316 container create cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0)
Jan 27 22:28:08 compute-1 podman[217778]: 2026-01-27 22:28:08.636574956 +0000 UTC m=+0.041329334 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 22:28:08 compute-1 systemd[1]: Started libpod-conmon-cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9.scope.
Jan 27 22:28:08 compute-1 systemd[1]: Started libcrun container.
Jan 27 22:28:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51898f55874a72eb58422091f6059b0e5ec658a8430143373df81b27f10d969b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 22:28:08 compute-1 podman[217778]: 2026-01-27 22:28:08.787330307 +0000 UTC m=+0.192084585 container init cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 27 22:28:08 compute-1 podman[217778]: 2026-01-27 22:28:08.794359171 +0000 UTC m=+0.199113449 container start cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260126, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:28:08 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[217794]: [NOTICE]   (217798) : New worker (217800) forked
Jan 27 22:28:08 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[217794]: [NOTICE]   (217798) : Loading success.
Jan 27 22:28:09 compute-1 nova_compute[183751]: 2026-01-27 22:28:09.020 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:28:09 compute-1 nova_compute[183751]: 2026-01-27 22:28:09.021 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:28:09 compute-1 nova_compute[183751]: 2026-01-27 22:28:09.021 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:28:09 compute-1 nova_compute[183751]: 2026-01-27 22:28:09.021 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:28:09 compute-1 nova_compute[183751]: 2026-01-27 22:28:09.021 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:28:09 compute-1 nova_compute[183751]: 2026-01-27 22:28:09.022 183755 DEBUG nova.virt.libvirt.driver [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:28:09 compute-1 nova_compute[183751]: 2026-01-27 22:28:09.130 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:09 compute-1 nova_compute[183751]: 2026-01-27 22:28:09.534 183755 INFO nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Took 11.19 seconds to spawn the instance on the hypervisor.
Jan 27 22:28:09 compute-1 nova_compute[183751]: 2026-01-27 22:28:09.535 183755 DEBUG nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 27 22:28:10 compute-1 nova_compute[183751]: 2026-01-27 22:28:10.065 183755 INFO nova.compute.manager [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Took 16.57 seconds to build instance.
Jan 27 22:28:10 compute-1 nova_compute[183751]: 2026-01-27 22:28:10.329 183755 DEBUG nova.compute.manager [req-897f0301-4432-4419-aa06-1c1758d1ea99 req-bc6fa872-28cb-473c-b8c5-39e627c9c390 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Received event network-vif-plugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:28:10 compute-1 nova_compute[183751]: 2026-01-27 22:28:10.329 183755 DEBUG oslo_concurrency.lockutils [req-897f0301-4432-4419-aa06-1c1758d1ea99 req-bc6fa872-28cb-473c-b8c5-39e627c9c390 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:10 compute-1 nova_compute[183751]: 2026-01-27 22:28:10.329 183755 DEBUG oslo_concurrency.lockutils [req-897f0301-4432-4419-aa06-1c1758d1ea99 req-bc6fa872-28cb-473c-b8c5-39e627c9c390 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:10 compute-1 nova_compute[183751]: 2026-01-27 22:28:10.330 183755 DEBUG oslo_concurrency.lockutils [req-897f0301-4432-4419-aa06-1c1758d1ea99 req-bc6fa872-28cb-473c-b8c5-39e627c9c390 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:10 compute-1 nova_compute[183751]: 2026-01-27 22:28:10.330 183755 DEBUG nova.compute.manager [req-897f0301-4432-4419-aa06-1c1758d1ea99 req-bc6fa872-28cb-473c-b8c5-39e627c9c390 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] No waiting events found dispatching network-vif-plugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:28:10 compute-1 nova_compute[183751]: 2026-01-27 22:28:10.330 183755 WARNING nova.compute.manager [req-897f0301-4432-4419-aa06-1c1758d1ea99 req-bc6fa872-28cb-473c-b8c5-39e627c9c390 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Received unexpected event network-vif-plugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a for instance with vm_state active and task_state None.
Jan 27 22:28:10 compute-1 nova_compute[183751]: 2026-01-27 22:28:10.570 183755 DEBUG oslo_concurrency.lockutils [None req-4c31aad3-4673-48f9-9381-b07220bcd382 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.219s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:11 compute-1 nova_compute[183751]: 2026-01-27 22:28:11.239 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:11.274 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:11.274 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:11.275 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:13 compute-1 podman[217811]: 2026-01-27 22:28:13.811593399 +0000 UTC m=+0.112137497 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 22:28:13 compute-1 podman[217810]: 2026-01-27 22:28:13.813779673 +0000 UTC m=+0.118540855 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container)
Jan 27 22:28:13 compute-1 nova_compute[183751]: 2026-01-27 22:28:13.856 183755 DEBUG oslo_concurrency.lockutils [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "ff165b27-6b1d-4da4-83de-b6f3a7913776" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:13 compute-1 nova_compute[183751]: 2026-01-27 22:28:13.856 183755 DEBUG oslo_concurrency.lockutils [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:13 compute-1 nova_compute[183751]: 2026-01-27 22:28:13.856 183755 DEBUG oslo_concurrency.lockutils [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:13 compute-1 nova_compute[183751]: 2026-01-27 22:28:13.856 183755 DEBUG oslo_concurrency.lockutils [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:13 compute-1 nova_compute[183751]: 2026-01-27 22:28:13.857 183755 DEBUG oslo_concurrency.lockutils [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:13 compute-1 nova_compute[183751]: 2026-01-27 22:28:13.872 183755 INFO nova.compute.manager [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Terminating instance
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.132 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.392 183755 DEBUG nova.compute.manager [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 27 22:28:14 compute-1 kernel: tap7aad95c9-8d (unregistering): left promiscuous mode
Jan 27 22:28:14 compute-1 NetworkManager[56069]: <info>  [1769552894.4145] device (tap7aad95c9-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 27 22:28:14 compute-1 ovn_controller[95969]: 2026-01-27T22:28:14Z|00080|binding|INFO|Releasing lport 7aad95c9-8de3-4ca5-a809-4c78b89d323a from this chassis (sb_readonly=0)
Jan 27 22:28:14 compute-1 ovn_controller[95969]: 2026-01-27T22:28:14Z|00081|binding|INFO|Setting lport 7aad95c9-8de3-4ca5-a809-4c78b89d323a down in Southbound
Jan 27 22:28:14 compute-1 ovn_controller[95969]: 2026-01-27T22:28:14Z|00082|binding|INFO|Removing iface tap7aad95c9-8d ovn-installed in OVS
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.425 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.428 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.437 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:eb:30 10.100.0.14'], port_security=['fa:16:3e:af:eb:30 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ff165b27-6b1d-4da4-83de-b6f3a7913776', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a3669bc840040159a7655f1b219810c', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f3a2e630-65c9-42f0-93ec-c493ad0a7683', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4db3b16f-6743-4c2d-a5f9-9b2b6d3313bd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=7aad95c9-8de3-4ca5-a809-4c78b89d323a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.438 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 7aad95c9-8de3-4ca5-a809-4c78b89d323a in datapath d5f6cbcb-0015-404b-a15e-d163be3d6b1a unbound from our chassis
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.440 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5f6cbcb-0015-404b-a15e-d163be3d6b1a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.441 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c28d2f72-6c1a-4531-9c25-e04dc01ed3e9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.441 105247 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a namespace which is not needed anymore
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.458 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:14 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 27 22:28:14 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Consumed 6.437s CPU time.
Jan 27 22:28:14 compute-1 systemd-machined[155034]: Machine qemu-5-instance-0000000a terminated.
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.607 183755 DEBUG nova.compute.manager [req-3a2a40fc-6046-4d20-9b18-5bb381d84f29 req-8d96a318-4c8d-4e5d-8441-bb24f8fa866f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Received event network-vif-unplugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.607 183755 DEBUG oslo_concurrency.lockutils [req-3a2a40fc-6046-4d20-9b18-5bb381d84f29 req-8d96a318-4c8d-4e5d-8441-bb24f8fa866f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.607 183755 DEBUG oslo_concurrency.lockutils [req-3a2a40fc-6046-4d20-9b18-5bb381d84f29 req-8d96a318-4c8d-4e5d-8441-bb24f8fa866f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.608 183755 DEBUG oslo_concurrency.lockutils [req-3a2a40fc-6046-4d20-9b18-5bb381d84f29 req-8d96a318-4c8d-4e5d-8441-bb24f8fa866f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.608 183755 DEBUG nova.compute.manager [req-3a2a40fc-6046-4d20-9b18-5bb381d84f29 req-8d96a318-4c8d-4e5d-8441-bb24f8fa866f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] No waiting events found dispatching network-vif-unplugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.608 183755 DEBUG nova.compute.manager [req-3a2a40fc-6046-4d20-9b18-5bb381d84f29 req-8d96a318-4c8d-4e5d-8441-bb24f8fa866f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Received event network-vif-unplugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.621 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:14 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[217794]: [NOTICE]   (217798) : haproxy version is 3.0.5-8e879a5
Jan 27 22:28:14 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[217794]: [NOTICE]   (217798) : path to executable is /usr/sbin/haproxy
Jan 27 22:28:14 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[217794]: [WARNING]  (217798) : Exiting Master process...
Jan 27 22:28:14 compute-1 podman[217874]: 2026-01-27 22:28:14.625717871 +0000 UTC m=+0.047759733 container kill cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:28:14 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[217794]: [ALERT]    (217798) : Current worker (217800) exited with code 143 (Terminated)
Jan 27 22:28:14 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[217794]: [WARNING]  (217798) : All workers exited. Exiting... (0)
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.631 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:14 compute-1 systemd[1]: libpod-cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9.scope: Deactivated successfully.
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.672 183755 INFO nova.virt.libvirt.driver [-] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Instance destroyed successfully.
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.673 183755 DEBUG nova.objects.instance [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lazy-loading 'resources' on Instance uuid ff165b27-6b1d-4da4-83de-b6f3a7913776 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:28:14 compute-1 podman[217898]: 2026-01-27 22:28:14.685210304 +0000 UTC m=+0.034450244 container died cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 27 22:28:14 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9-userdata-shm.mount: Deactivated successfully.
Jan 27 22:28:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-51898f55874a72eb58422091f6059b0e5ec658a8430143373df81b27f10d969b-merged.mount: Deactivated successfully.
Jan 27 22:28:14 compute-1 podman[217898]: 2026-01-27 22:28:14.733357065 +0000 UTC m=+0.082596965 container cleanup cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Jan 27 22:28:14 compute-1 systemd[1]: libpod-conmon-cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9.scope: Deactivated successfully.
Jan 27 22:28:14 compute-1 podman[217902]: 2026-01-27 22:28:14.749342311 +0000 UTC m=+0.076739720 container remove cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS)
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.757 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7c89182b-604c-498e-9936-1fa992390ba8]: (4, ("Tue Jan 27 10:28:14 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a (cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9)\ncd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9\nTue Jan 27 10:28:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a (cd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9)\ncd5e90f789c63baa44c0470d1f3b4f279181de7caae203a7c8cabca86404bce9\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.759 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd0f23b-d869-45d9-8d7d-01e08d64c32a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.760 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.760 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[a436f0be-9830-449b-afcd-ed9993ec9382]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.761 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f6cbcb-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.763 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:14 compute-1 kernel: tapd5f6cbcb-00: left promiscuous mode
Jan 27 22:28:14 compute-1 nova_compute[183751]: 2026-01-27 22:28:14.790 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.794 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[16fb48e3-5801-4eea-87c0-64229f78000d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.809 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[4d883038-2a99-4dc9-a656-e63a5465ee73]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.810 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[17c0474a-99ad-424a-9ec7-a22722f0e098]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.829 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5afb94-c57b-42b9-8407-54e6c63cb521]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 903026, 'reachable_time': 24236, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217939, 'error': None, 'target': 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.832 105687 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 27 22:28:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:28:14.833 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[11b3d930-8f7b-43ef-abb8-44806004d89d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:28:14 compute-1 systemd[1]: run-netns-ovnmeta\x2dd5f6cbcb\x2d0015\x2d404b\x2da15e\x2dd163be3d6b1a.mount: Deactivated successfully.
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.181 183755 DEBUG nova.virt.libvirt.vif [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-27T22:27:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1588832548',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-158',id=10,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-27T22:28:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a3669bc840040159a7655f1b219810c',ramdisk_id='',reservation_id='r-oht7c44j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-27T22:28:09Z,user_data=None,user_id='4b9b219e067b4a669e9564e586cb41cd',uuid=ff165b27-6b1d-4da4-83de-b6f3a7913776,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "address": "fa:16:3e:af:eb:30", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad95c9-8d", "ovs_interfaceid": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.182 183755 DEBUG nova.network.os_vif_util [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converting VIF {"id": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "address": "fa:16:3e:af:eb:30", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aad95c9-8d", "ovs_interfaceid": "7aad95c9-8de3-4ca5-a809-4c78b89d323a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.183 183755 DEBUG nova.network.os_vif_util [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:eb:30,bridge_name='br-int',has_traffic_filtering=True,id=7aad95c9-8de3-4ca5-a809-4c78b89d323a,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad95c9-8d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.183 183755 DEBUG os_vif [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:eb:30,bridge_name='br-int',has_traffic_filtering=True,id=7aad95c9-8de3-4ca5-a809-4c78b89d323a,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad95c9-8d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.186 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.186 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7aad95c9-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.188 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.190 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.191 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.192 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c9222467-586c-4af4-94d0-b5d569aec7aa) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.193 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.194 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.197 183755 INFO os_vif [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:eb:30,bridge_name='br-int',has_traffic_filtering=True,id=7aad95c9-8de3-4ca5-a809-4c78b89d323a,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aad95c9-8d')
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.198 183755 INFO nova.virt.libvirt.driver [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Deleting instance files /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776_del
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.199 183755 INFO nova.virt.libvirt.driver [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Deletion of /var/lib/nova/instances/ff165b27-6b1d-4da4-83de-b6f3a7913776_del complete
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.714 183755 INFO nova.compute.manager [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Took 1.32 seconds to destroy the instance on the hypervisor.
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.715 183755 DEBUG oslo.service.backend._eventlet.loopingcall [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.715 183755 DEBUG nova.compute.manager [-] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.715 183755 DEBUG nova.network.neutron [-] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 27 22:28:15 compute-1 nova_compute[183751]: 2026-01-27 22:28:15.716 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:28:16 compute-1 nova_compute[183751]: 2026-01-27 22:28:16.241 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:16 compute-1 nova_compute[183751]: 2026-01-27 22:28:16.302 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:28:16 compute-1 nova_compute[183751]: 2026-01-27 22:28:16.708 183755 DEBUG nova.compute.manager [req-f48c05aa-fd31-42a8-ac8e-abab53c33d3a req-ecc977a5-1c07-490b-8e21-520d9dc7f0ed 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Received event network-vif-unplugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:28:16 compute-1 nova_compute[183751]: 2026-01-27 22:28:16.709 183755 DEBUG oslo_concurrency.lockutils [req-f48c05aa-fd31-42a8-ac8e-abab53c33d3a req-ecc977a5-1c07-490b-8e21-520d9dc7f0ed 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:16 compute-1 nova_compute[183751]: 2026-01-27 22:28:16.709 183755 DEBUG oslo_concurrency.lockutils [req-f48c05aa-fd31-42a8-ac8e-abab53c33d3a req-ecc977a5-1c07-490b-8e21-520d9dc7f0ed 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:16 compute-1 nova_compute[183751]: 2026-01-27 22:28:16.710 183755 DEBUG oslo_concurrency.lockutils [req-f48c05aa-fd31-42a8-ac8e-abab53c33d3a req-ecc977a5-1c07-490b-8e21-520d9dc7f0ed 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:16 compute-1 nova_compute[183751]: 2026-01-27 22:28:16.710 183755 DEBUG nova.compute.manager [req-f48c05aa-fd31-42a8-ac8e-abab53c33d3a req-ecc977a5-1c07-490b-8e21-520d9dc7f0ed 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] No waiting events found dispatching network-vif-unplugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:28:16 compute-1 nova_compute[183751]: 2026-01-27 22:28:16.710 183755 DEBUG nova.compute.manager [req-f48c05aa-fd31-42a8-ac8e-abab53c33d3a req-ecc977a5-1c07-490b-8e21-520d9dc7f0ed 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Received event network-vif-unplugged-7aad95c9-8de3-4ca5-a809-4c78b89d323a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:28:17 compute-1 nova_compute[183751]: 2026-01-27 22:28:17.073 183755 DEBUG nova.compute.manager [req-1ee5054a-66df-4203-bdf6-c3002097ff71 req-0275e4ed-daa4-4835-8cea-66d03eae3b18 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Received event network-vif-deleted-7aad95c9-8de3-4ca5-a809-4c78b89d323a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:28:17 compute-1 nova_compute[183751]: 2026-01-27 22:28:17.073 183755 INFO nova.compute.manager [req-1ee5054a-66df-4203-bdf6-c3002097ff71 req-0275e4ed-daa4-4835-8cea-66d03eae3b18 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Neutron deleted interface 7aad95c9-8de3-4ca5-a809-4c78b89d323a; detaching it from the instance and deleting it from the info cache
Jan 27 22:28:17 compute-1 nova_compute[183751]: 2026-01-27 22:28:17.073 183755 DEBUG nova.network.neutron [req-1ee5054a-66df-4203-bdf6-c3002097ff71 req-0275e4ed-daa4-4835-8cea-66d03eae3b18 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:28:17 compute-1 nova_compute[183751]: 2026-01-27 22:28:17.236 183755 DEBUG nova.network.neutron [-] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:28:17 compute-1 nova_compute[183751]: 2026-01-27 22:28:17.582 183755 DEBUG nova.compute.manager [req-1ee5054a-66df-4203-bdf6-c3002097ff71 req-0275e4ed-daa4-4835-8cea-66d03eae3b18 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Detach interface failed, port_id=7aad95c9-8de3-4ca5-a809-4c78b89d323a, reason: Instance ff165b27-6b1d-4da4-83de-b6f3a7913776 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 27 22:28:17 compute-1 nova_compute[183751]: 2026-01-27 22:28:17.748 183755 INFO nova.compute.manager [-] [instance: ff165b27-6b1d-4da4-83de-b6f3a7913776] Took 2.03 seconds to deallocate network for instance.
Jan 27 22:28:18 compute-1 nova_compute[183751]: 2026-01-27 22:28:18.272 183755 DEBUG oslo_concurrency.lockutils [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:18 compute-1 nova_compute[183751]: 2026-01-27 22:28:18.273 183755 DEBUG oslo_concurrency.lockutils [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:18 compute-1 nova_compute[183751]: 2026-01-27 22:28:18.355 183755 DEBUG nova.compute.provider_tree [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:28:18 compute-1 nova_compute[183751]: 2026-01-27 22:28:18.863 183755 DEBUG nova.scheduler.client.report [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:28:19 compute-1 nova_compute[183751]: 2026-01-27 22:28:19.374 183755 DEBUG oslo_concurrency.lockutils [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:19 compute-1 nova_compute[183751]: 2026-01-27 22:28:19.415 183755 INFO nova.scheduler.client.report [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Deleted allocations for instance ff165b27-6b1d-4da4-83de-b6f3a7913776
Jan 27 22:28:19 compute-1 openstack_network_exporter[195945]: ERROR   22:28:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:28:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:28:19 compute-1 openstack_network_exporter[195945]: ERROR   22:28:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:28:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:28:20 compute-1 nova_compute[183751]: 2026-01-27 22:28:20.194 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:20 compute-1 nova_compute[183751]: 2026-01-27 22:28:20.458 183755 DEBUG oslo_concurrency.lockutils [None req-33a4ad76-391f-4070-b439-a61f28a850c3 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "ff165b27-6b1d-4da4-83de-b6f3a7913776" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.602s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:21 compute-1 nova_compute[183751]: 2026-01-27 22:28:21.243 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:22 compute-1 podman[217940]: 2026-01-27 22:28:22.751554617 +0000 UTC m=+0.063982935 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:28:25 compute-1 nova_compute[183751]: 2026-01-27 22:28:25.196 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:26 compute-1 nova_compute[183751]: 2026-01-27 22:28:26.145 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:28:26 compute-1 nova_compute[183751]: 2026-01-27 22:28:26.272 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:30 compute-1 nova_compute[183751]: 2026-01-27 22:28:30.197 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:31 compute-1 nova_compute[183751]: 2026-01-27 22:28:31.276 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:35 compute-1 nova_compute[183751]: 2026-01-27 22:28:35.272 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:35 compute-1 podman[193064]: time="2026-01-27T22:28:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:28:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:28:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:28:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:28:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2184 "" "Go-http-client/1.1"
Jan 27 22:28:36 compute-1 nova_compute[183751]: 2026-01-27 22:28:36.317 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:38 compute-1 podman[217966]: 2026-01-27 22:28:38.844407378 +0000 UTC m=+0.141955105 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 27 22:28:40 compute-1 nova_compute[183751]: 2026-01-27 22:28:40.274 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:41 compute-1 nova_compute[183751]: 2026-01-27 22:28:41.319 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:44 compute-1 podman[217993]: 2026-01-27 22:28:44.801774648 +0000 UTC m=+0.095717170 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260126)
Jan 27 22:28:44 compute-1 podman[217992]: 2026-01-27 22:28:44.8014513 +0000 UTC m=+0.106539737 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, config_id=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350)
Jan 27 22:28:45 compute-1 nova_compute[183751]: 2026-01-27 22:28:45.276 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:45 compute-1 sshd-session[218032]: Invalid user lighthouse from 80.94.92.186 port 60368
Jan 27 22:28:45 compute-1 sshd-session[218032]: Connection closed by invalid user lighthouse 80.94.92.186 port 60368 [preauth]
Jan 27 22:28:46 compute-1 nova_compute[183751]: 2026-01-27 22:28:46.322 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:49 compute-1 openstack_network_exporter[195945]: ERROR   22:28:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:28:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:28:49 compute-1 openstack_network_exporter[195945]: ERROR   22:28:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:28:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:28:50 compute-1 nova_compute[183751]: 2026-01-27 22:28:50.279 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:51 compute-1 nova_compute[183751]: 2026-01-27 22:28:51.358 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:53 compute-1 podman[218034]: 2026-01-27 22:28:53.77680797 +0000 UTC m=+0.077258339 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:28:54 compute-1 nova_compute[183751]: 2026-01-27 22:28:54.418 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "3b489138-954b-41f7-ad29-32ba9abe9bc1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:54 compute-1 nova_compute[183751]: 2026-01-27 22:28:54.418 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:54 compute-1 nova_compute[183751]: 2026-01-27 22:28:54.926 183755 DEBUG nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 27 22:28:55 compute-1 nova_compute[183751]: 2026-01-27 22:28:55.280 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:55 compute-1 nova_compute[183751]: 2026-01-27 22:28:55.498 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:55 compute-1 nova_compute[183751]: 2026-01-27 22:28:55.499 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:55 compute-1 nova_compute[183751]: 2026-01-27 22:28:55.510 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 27 22:28:55 compute-1 nova_compute[183751]: 2026-01-27 22:28:55.511 183755 INFO nova.compute.claims [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Claim successful on node compute-1.ctlplane.example.com
Jan 27 22:28:56 compute-1 nova_compute[183751]: 2026-01-27 22:28:56.392 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:28:56 compute-1 nova_compute[183751]: 2026-01-27 22:28:56.589 183755 DEBUG nova.compute.provider_tree [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:28:57 compute-1 nova_compute[183751]: 2026-01-27 22:28:57.098 183755 DEBUG nova.scheduler.client.report [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:28:57 compute-1 nova_compute[183751]: 2026-01-27 22:28:57.612 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:57 compute-1 nova_compute[183751]: 2026-01-27 22:28:57.613 183755 DEBUG nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.124 183755 DEBUG nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.125 183755 DEBUG nova.network.neutron [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.126 183755 WARNING neutronclient.v2_0.client [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.127 183755 WARNING neutronclient.v2_0.client [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.807 183755 INFO nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.811 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.811 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.812 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.812 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.966 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:28:58 compute-1 nova_compute[183751]: 2026-01-27 22:28:58.968 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:28:59 compute-1 nova_compute[183751]: 2026-01-27 22:28:59.007 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:28:59 compute-1 nova_compute[183751]: 2026-01-27 22:28:59.008 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5863MB free_disk=73.14262771606445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:28:59 compute-1 nova_compute[183751]: 2026-01-27 22:28:59.009 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:28:59 compute-1 nova_compute[183751]: 2026-01-27 22:28:59.009 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:28:59 compute-1 nova_compute[183751]: 2026-01-27 22:28:59.320 183755 DEBUG nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 27 22:28:59 compute-1 nova_compute[183751]: 2026-01-27 22:28:59.427 183755 DEBUG nova.network.neutron [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Successfully created port: 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.062 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Instance 3b489138-954b-41f7-ad29-32ba9abe9bc1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.062 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.063 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:28:58 up  2:31,  0 user,  load average: 0.23, 0.25, 0.14\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_1a3669bc840040159a7655f1b219810c': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.190 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.283 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.347 183755 DEBUG nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.349 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.350 183755 INFO nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Creating image(s)
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.351 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "/var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.351 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "/var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.353 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "/var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.354 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.360 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.362 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.446 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.448 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.449 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.450 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.452 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.453 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.519 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.521 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.562 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.564 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.564 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.630 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.632 183755 DEBUG nova.virt.disk.api [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Checking if we can resize image /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.633 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.697 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.699 183755 DEBUG nova.virt.disk.api [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Cannot resize image /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.700 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.700 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Ensure instance console log exists: /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.701 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.702 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.703 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:00 compute-1 nova_compute[183751]: 2026-01-27 22:29:00.705 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:29:01 compute-1 nova_compute[183751]: 2026-01-27 22:29:01.222 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:29:01 compute-1 nova_compute[183751]: 2026-01-27 22:29:01.223 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.213s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:01 compute-1 nova_compute[183751]: 2026-01-27 22:29:01.404 183755 DEBUG nova.network.neutron [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Successfully updated port: 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 27 22:29:01 compute-1 nova_compute[183751]: 2026-01-27 22:29:01.443 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:01 compute-1 nova_compute[183751]: 2026-01-27 22:29:01.478 183755 DEBUG nova.compute.manager [req-98354256-3813-43ba-8835-c731417fc15e req-bfa1bea4-24b5-43bb-b0b5-63e4ae0f9459 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Received event network-changed-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:29:01 compute-1 nova_compute[183751]: 2026-01-27 22:29:01.478 183755 DEBUG nova.compute.manager [req-98354256-3813-43ba-8835-c731417fc15e req-bfa1bea4-24b5-43bb-b0b5-63e4ae0f9459 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Refreshing instance network info cache due to event network-changed-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 27 22:29:01 compute-1 nova_compute[183751]: 2026-01-27 22:29:01.479 183755 DEBUG oslo_concurrency.lockutils [req-98354256-3813-43ba-8835-c731417fc15e req-bfa1bea4-24b5-43bb-b0b5-63e4ae0f9459 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "refresh_cache-3b489138-954b-41f7-ad29-32ba9abe9bc1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:29:01 compute-1 nova_compute[183751]: 2026-01-27 22:29:01.479 183755 DEBUG oslo_concurrency.lockutils [req-98354256-3813-43ba-8835-c731417fc15e req-bfa1bea4-24b5-43bb-b0b5-63e4ae0f9459 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquired lock "refresh_cache-3b489138-954b-41f7-ad29-32ba9abe9bc1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:29:01 compute-1 nova_compute[183751]: 2026-01-27 22:29:01.480 183755 DEBUG nova.network.neutron [req-98354256-3813-43ba-8835-c731417fc15e req-bfa1bea4-24b5-43bb-b0b5-63e4ae0f9459 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Refreshing network info cache for port 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 27 22:29:01 compute-1 nova_compute[183751]: 2026-01-27 22:29:01.912 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "refresh_cache-3b489138-954b-41f7-ad29-32ba9abe9bc1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:29:02 compute-1 nova_compute[183751]: 2026-01-27 22:29:02.028 183755 WARNING neutronclient.v2_0.client [req-98354256-3813-43ba-8835-c731417fc15e req-bfa1bea4-24b5-43bb-b0b5-63e4ae0f9459 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:29:02 compute-1 nova_compute[183751]: 2026-01-27 22:29:02.223 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:29:02 compute-1 nova_compute[183751]: 2026-01-27 22:29:02.223 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:29:02 compute-1 nova_compute[183751]: 2026-01-27 22:29:02.224 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:29:02 compute-1 nova_compute[183751]: 2026-01-27 22:29:02.224 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:29:02 compute-1 nova_compute[183751]: 2026-01-27 22:29:02.224 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:29:02 compute-1 nova_compute[183751]: 2026-01-27 22:29:02.336 183755 DEBUG nova.network.neutron [req-98354256-3813-43ba-8835-c731417fc15e req-bfa1bea4-24b5-43bb-b0b5-63e4ae0f9459 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:29:02 compute-1 nova_compute[183751]: 2026-01-27 22:29:02.536 183755 DEBUG nova.network.neutron [req-98354256-3813-43ba-8835-c731417fc15e req-bfa1bea4-24b5-43bb-b0b5-63e4ae0f9459 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:29:03 compute-1 nova_compute[183751]: 2026-01-27 22:29:03.047 183755 DEBUG oslo_concurrency.lockutils [req-98354256-3813-43ba-8835-c731417fc15e req-bfa1bea4-24b5-43bb-b0b5-63e4ae0f9459 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Releasing lock "refresh_cache-3b489138-954b-41f7-ad29-32ba9abe9bc1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:29:03 compute-1 nova_compute[183751]: 2026-01-27 22:29:03.048 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquired lock "refresh_cache-3b489138-954b-41f7-ad29-32ba9abe9bc1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:29:03 compute-1 nova_compute[183751]: 2026-01-27 22:29:03.048 183755 DEBUG nova.network.neutron [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 27 22:29:03 compute-1 nova_compute[183751]: 2026-01-27 22:29:03.721 183755 DEBUG nova.network.neutron [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.041 183755 WARNING neutronclient.v2_0.client [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.147 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.472 183755 DEBUG nova.network.neutron [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Updating instance_info_cache with network_info: [{"id": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "address": "fa:16:3e:79:64:53", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a73a1cc-3c", "ovs_interfaceid": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.981 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Releasing lock "refresh_cache-3b489138-954b-41f7-ad29-32ba9abe9bc1" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.982 183755 DEBUG nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Instance network_info: |[{"id": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "address": "fa:16:3e:79:64:53", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a73a1cc-3c", "ovs_interfaceid": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.986 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Start _get_guest_xml network_info=[{"id": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "address": "fa:16:3e:79:64:53", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a73a1cc-3c", "ovs_interfaceid": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '46eb297a-0b7d-41f9-8336-a7ae35b5797e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.990 183755 WARNING nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.992 183755 DEBUG nova.virt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1121773310', uuid='3b489138-954b-41f7-ad29-32ba9abe9bc1'), owner=OwnerMeta(userid='4b9b219e067b4a669e9564e586cb41cd', username='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885-project-admin', projectid='1a3669bc840040159a7655f1b219810c', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885'), image=ImageMeta(id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "address": "fa:16:3e:79:64:53", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap0a73a1cc-3c", "ovs_interfaceid": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769552944.9921334) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.996 183755 DEBUG nova.virt.libvirt.host [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 27 22:29:04 compute-1 nova_compute[183751]: 2026-01-27 22:29:04.997 183755 DEBUG nova.virt.libvirt.host [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.000 183755 DEBUG nova.virt.libvirt.host [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.001 183755 DEBUG nova.virt.libvirt.host [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.002 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.003 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-27T22:07:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.003 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.004 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.004 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.005 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.005 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.005 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.006 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.006 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.006 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.007 183755 DEBUG nova.virt.hardware [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.012 183755 DEBUG nova.virt.libvirt.vif [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:28:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1121773310',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-112',id=12,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a3669bc840040159a7655f1b219810c',ramdisk_id='',reservation_id='r-f00k0lr8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885',own
er_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:28:59Z,user_data=None,user_id='4b9b219e067b4a669e9564e586cb41cd',uuid=3b489138-954b-41f7-ad29-32ba9abe9bc1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "address": "fa:16:3e:79:64:53", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a73a1cc-3c", "ovs_interfaceid": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.013 183755 DEBUG nova.network.os_vif_util [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converting VIF {"id": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "address": "fa:16:3e:79:64:53", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a73a1cc-3c", "ovs_interfaceid": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.014 183755 DEBUG nova.network.os_vif_util [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:64:53,bridge_name='br-int',has_traffic_filtering=True,id=0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a73a1cc-3c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.015 183755 DEBUG nova.objects.instance [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b489138-954b-41f7-ad29-32ba9abe9bc1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.285 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.525 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] End _get_guest_xml xml=<domain type="kvm">
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <uuid>3b489138-954b-41f7-ad29-32ba9abe9bc1</uuid>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <name>instance-0000000c</name>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <memory>131072</memory>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <vcpu>1</vcpu>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <metadata>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1121773310</nova:name>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <nova:creationTime>2026-01-27 22:29:04</nova:creationTime>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <nova:flavor name="m1.nano" id="4ed8ebce-d846-4d13-950a-4fb8c3a02942">
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:memory>128</nova:memory>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:disk>1</nova:disk>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:swap>0</nova:swap>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:ephemeral>0</nova:ephemeral>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:vcpus>1</nova:vcpus>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:extraSpecs>
Jan 27 22:29:05 compute-1 nova_compute[183751]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         </nova:extraSpecs>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       </nova:flavor>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <nova:image uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e">
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:minDisk>1</nova:minDisk>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:minRam>0</nova:minRam>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:properties>
Jan 27 22:29:05 compute-1 nova_compute[183751]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         </nova:properties>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       </nova:image>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <nova:owner>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:user uuid="4b9b219e067b4a669e9564e586cb41cd">tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885-project-admin</nova:user>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:project uuid="1a3669bc840040159a7655f1b219810c">tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885</nova:project>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       </nova:owner>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <nova:root type="image" uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <nova:ports>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         <nova:port uuid="0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2">
Jan 27 22:29:05 compute-1 nova_compute[183751]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:         </nova:port>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       </nova:ports>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     </nova:instance>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   </metadata>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <sysinfo type="smbios">
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <system>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <entry name="manufacturer">RDO</entry>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <entry name="product">OpenStack Compute</entry>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <entry name="serial">3b489138-954b-41f7-ad29-32ba9abe9bc1</entry>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <entry name="uuid">3b489138-954b-41f7-ad29-32ba9abe9bc1</entry>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <entry name="family">Virtual Machine</entry>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     </system>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   </sysinfo>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <os>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <boot dev="hd"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <smbios mode="sysinfo"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   </os>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <features>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <acpi/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <apic/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <vmcoreinfo/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   </features>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <clock offset="utc">
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <timer name="pit" tickpolicy="delay"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <timer name="hpet" present="no"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   </clock>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <cpu mode="custom" match="exact">
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <model>Nehalem</model>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <topology sockets="1" cores="1" threads="1"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   </cpu>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   <devices>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <disk type="file" device="disk">
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <target dev="vda" bus="virtio"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <disk type="file" device="cdrom">
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <driver name="qemu" type="raw" cache="none"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk.config"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <target dev="sda" bus="sata"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <interface type="ethernet">
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <mac address="fa:16:3e:79:64:53"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <driver name="vhost" rx_queue_size="512"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <mtu size="1442"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <target dev="tap0a73a1cc-3c"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     </interface>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <serial type="pty">
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <log file="/var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/console.log" append="off"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     </serial>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <video>
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     </video>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <input type="tablet" bus="usb"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <rng model="virtio">
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <backend model="random">/dev/urandom</backend>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     </rng>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <controller type="usb" index="0"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 27 22:29:05 compute-1 nova_compute[183751]:       <stats period="10"/>
Jan 27 22:29:05 compute-1 nova_compute[183751]:     </memballoon>
Jan 27 22:29:05 compute-1 nova_compute[183751]:   </devices>
Jan 27 22:29:05 compute-1 nova_compute[183751]: </domain>
Jan 27 22:29:05 compute-1 nova_compute[183751]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.527 183755 DEBUG nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Preparing to wait for external event network-vif-plugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.527 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.528 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.528 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.529 183755 DEBUG nova.virt.libvirt.vif [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:28:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1121773310',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-112',id=12,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a3669bc840040159a7655f1b219810c',ramdisk_id='',reservation_id='r-f00k0lr8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:28:59Z,user_data=None,user_id='4b9b219e067b4a669e9564e586cb41cd',uuid=3b489138-954b-41f7-ad29-32ba9abe9bc1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "address": "fa:16:3e:79:64:53", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a73a1cc-3c", "ovs_interfaceid": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.530 183755 DEBUG nova.network.os_vif_util [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converting VIF {"id": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "address": "fa:16:3e:79:64:53", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a73a1cc-3c", "ovs_interfaceid": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.531 183755 DEBUG nova.network.os_vif_util [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:64:53,bridge_name='br-int',has_traffic_filtering=True,id=0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a73a1cc-3c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.531 183755 DEBUG os_vif [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:64:53,bridge_name='br-int',has_traffic_filtering=True,id=0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a73a1cc-3c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.532 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.533 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.534 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.535 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.535 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5632ef72-df07-59ca-8dd6-1a71839f6386', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.537 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.539 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.549 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.549 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a73a1cc-3c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.550 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap0a73a1cc-3c, col_values=(('qos', UUID('c82d2ff5-39e1-4e19-a216-0fa27c299a6f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.550 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap0a73a1cc-3c, col_values=(('external_ids', {'iface-id': '0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:79:64:53', 'vm-uuid': '3b489138-954b-41f7-ad29-32ba9abe9bc1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.551 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:05 compute-1 NetworkManager[56069]: <info>  [1769552945.5528] manager: (tap0a73a1cc-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.554 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.557 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:05 compute-1 nova_compute[183751]: 2026-01-27 22:29:05.558 183755 INFO os_vif [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:64:53,bridge_name='br-int',has_traffic_filtering=True,id=0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a73a1cc-3c')
Jan 27 22:29:05 compute-1 podman[193064]: time="2026-01-27T22:29:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:29:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:29:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:29:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:29:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 27 22:29:06 compute-1 nova_compute[183751]: 2026-01-27 22:29:06.489 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:07 compute-1 nova_compute[183751]: 2026-01-27 22:29:07.117 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:29:07 compute-1 nova_compute[183751]: 2026-01-27 22:29:07.118 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:29:07 compute-1 nova_compute[183751]: 2026-01-27 22:29:07.118 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] No VIF found with MAC fa:16:3e:79:64:53, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:29:07 compute-1 nova_compute[183751]: 2026-01-27 22:29:07.119 183755 INFO nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Using config drive
Jan 27 22:29:07 compute-1 nova_compute[183751]: 2026-01-27 22:29:07.633 183755 WARNING neutronclient.v2_0.client [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.338 183755 INFO nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Creating config drive at /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk.config
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.348 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp4qu0035x execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.484 183755 DEBUG oslo_concurrency.processutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp4qu0035x" returned: 0 in 0.137s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:29:08 compute-1 kernel: tap0a73a1cc-3c: entered promiscuous mode
Jan 27 22:29:08 compute-1 NetworkManager[56069]: <info>  [1769552948.5758] manager: (tap0a73a1cc-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.575 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:08 compute-1 ovn_controller[95969]: 2026-01-27T22:29:08Z|00083|binding|INFO|Claiming lport 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 for this chassis.
Jan 27 22:29:08 compute-1 ovn_controller[95969]: 2026-01-27T22:29:08Z|00084|binding|INFO|0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2: Claiming fa:16:3e:79:64:53 10.100.0.13
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.590 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:64:53 10.100.0.13'], port_security=['fa:16:3e:79:64:53 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3b489138-954b-41f7-ad29-32ba9abe9bc1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a3669bc840040159a7655f1b219810c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f3a2e630-65c9-42f0-93ec-c493ad0a7683', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4db3b16f-6743-4c2d-a5f9-9b2b6d3313bd, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.591 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 in datapath d5f6cbcb-0015-404b-a15e-d163be3d6b1a bound to our chassis
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.593 105247 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d5f6cbcb-0015-404b-a15e-d163be3d6b1a
Jan 27 22:29:08 compute-1 ovn_controller[95969]: 2026-01-27T22:29:08Z|00085|binding|INFO|Setting lport 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 ovn-installed in OVS
Jan 27 22:29:08 compute-1 ovn_controller[95969]: 2026-01-27T22:29:08Z|00086|binding|INFO|Setting lport 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 up in Southbound
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.607 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.612 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.615 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[a4493bf6-b725-49cd-afac-83e4b290e623]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.616 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd5f6cbcb-01 in ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.619 212869 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd5f6cbcb-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.619 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[9989aead-786c-4f26-a324-b1bb0d802895]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.622 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[950815df-910d-4796-af0e-f8dfe85db52d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 systemd-udevd[218099]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.641 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[23dba9ff-e36e-42ed-b5c6-5b3319964134]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 systemd-machined[155034]: New machine qemu-6-instance-0000000c.
Jan 27 22:29:08 compute-1 systemd[1]: Started Virtual Machine qemu-6-instance-0000000c.
Jan 27 22:29:08 compute-1 NetworkManager[56069]: <info>  [1769552948.6599] device (tap0a73a1cc-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 22:29:08 compute-1 NetworkManager[56069]: <info>  [1769552948.6620] device (tap0a73a1cc-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.661 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[1b8441f9-2d62-45c0-851c-035518c816c5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.712 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[171308e8-a204-45d7-8470-1c1d613c9057]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.718 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[19c9f764-a360-4e17-abbd-182347e9e4d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 systemd-udevd[218103]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:29:08 compute-1 NetworkManager[56069]: <info>  [1769552948.7206] manager: (tapd5f6cbcb-00): new Veth device (/org/freedesktop/NetworkManager/Devices/42)
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.770 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[57e448ef-fcc2-4ec4-8767-99eac7c2b5d9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.774 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ed4fb3-0a46-496e-8832-e2c7e874c102]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 NetworkManager[56069]: <info>  [1769552948.8094] device (tapd5f6cbcb-00): carrier: link connected
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.818 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[440e34aa-8cc0-4d6e-b84b-5312c14fc7ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.843 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[28fc25ee-3089-4ef7-baaf-c7d035896527]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5f6cbcb-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:2b:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 909107, 'reachable_time': 27137, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218131, 'error': None, 'target': 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.854 183755 DEBUG nova.compute.manager [req-790c7849-413d-44b4-a546-6528bc80fe07 req-c6243ba2-126b-444a-a113-3865e1c79217 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Received event network-vif-plugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.855 183755 DEBUG oslo_concurrency.lockutils [req-790c7849-413d-44b4-a546-6528bc80fe07 req-c6243ba2-126b-444a-a113-3865e1c79217 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.855 183755 DEBUG oslo_concurrency.lockutils [req-790c7849-413d-44b4-a546-6528bc80fe07 req-c6243ba2-126b-444a-a113-3865e1c79217 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.856 183755 DEBUG oslo_concurrency.lockutils [req-790c7849-413d-44b4-a546-6528bc80fe07 req-c6243ba2-126b-444a-a113-3865e1c79217 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:08 compute-1 nova_compute[183751]: 2026-01-27 22:29:08.856 183755 DEBUG nova.compute.manager [req-790c7849-413d-44b4-a546-6528bc80fe07 req-c6243ba2-126b-444a-a113-3865e1c79217 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Processing event network-vif-plugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.868 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[494b2155-b6c7-4dc1-868f-d55c170c560a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4d:2b21'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 909107, 'tstamp': 909107}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218132, 'error': None, 'target': 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.893 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[db4d6014-e307-4be7-8fe7-22cf48947290]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5f6cbcb-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:2b:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 909107, 'reachable_time': 27137, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218133, 'error': None, 'target': 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:08 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:08.938 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[1270fe75-3ca4-4d82-80fc-e5c24054fbdd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.026 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7e16bbe0-7a51-4dff-b64f-0ccd5ca52a7c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.027 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f6cbcb-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.027 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.028 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5f6cbcb-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.030 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:09 compute-1 NetworkManager[56069]: <info>  [1769552949.0314] manager: (tapd5f6cbcb-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Jan 27 22:29:09 compute-1 kernel: tapd5f6cbcb-00: entered promiscuous mode
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.035 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.044 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd5f6cbcb-00, col_values=(('external_ids', {'iface-id': '6e012cf2-42c8-4ee7-a370-6ad1e6e56131'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.045 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.046 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:09 compute-1 ovn_controller[95969]: 2026-01-27T22:29:09Z|00087|binding|INFO|Releasing lport 6e012cf2-42c8-4ee7-a370-6ad1e6e56131 from this chassis (sb_readonly=0)
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.058 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[56c0773d-6db1-48b6-81ab-c5c6b46ce63b]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.059 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.059 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.059 105247 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for d5f6cbcb-0015-404b-a15e-d163be3d6b1a disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.059 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.060 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[83fd8d79-925a-4f23-a203-85cb313e4882]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.060 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.060 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd7723f-b65a-455b-b34f-f9e9031aefa2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.061 105247 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: global
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     log         /dev/log local0 debug
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     log-tag     haproxy-metadata-proxy-d5f6cbcb-0015-404b-a15e-d163be3d6b1a
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     user        root
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     group       root
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     maxconn     1024
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     pidfile     /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     daemon
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: defaults
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     log global
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     mode http
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     option httplog
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     option dontlognull
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     option http-server-close
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     option forwardfor
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     retries                 3
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     timeout http-request    30s
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     timeout connect         30s
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     timeout client          32s
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     timeout server          32s
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     timeout http-keep-alive 30s
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: listen listener
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     bind 169.254.169.254:80
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     server metadata /var/lib/neutron/metadata_proxy
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:     http-request add-header X-OVN-Network-ID d5f6cbcb-0015-404b-a15e-d163be3d6b1a
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.061 105247 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'env', 'PROCESS_TAG=haproxy-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.079 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.266 183755 DEBUG nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.271 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.274 183755 INFO nova.virt.libvirt.driver [-] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Instance spawned successfully.
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.275 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.407 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.408 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:09 compute-1 podman[218172]: 2026-01-27 22:29:09.593116016 +0000 UTC m=+0.075876296 container create f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:29:09 compute-1 podman[218172]: 2026-01-27 22:29:09.554679376 +0000 UTC m=+0.037439586 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 22:29:09 compute-1 systemd[1]: Started libpod-conmon-f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6.scope.
Jan 27 22:29:09 compute-1 systemd[1]: Started libcrun container.
Jan 27 22:29:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bad692c5f86c2e1f32f7d5776008d8c60d8a318b4574bef45487b6d8e01d39d6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 22:29:09 compute-1 podman[218172]: 2026-01-27 22:29:09.716248147 +0000 UTC m=+0.199008327 container init f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260126)
Jan 27 22:29:09 compute-1 podman[218172]: 2026-01-27 22:29:09.728301445 +0000 UTC m=+0.211061605 container start f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 27 22:29:09 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[218188]: [NOTICE]   (218208) : New worker (218215) forked
Jan 27 22:29:09 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[218188]: [NOTICE]   (218208) : Loading success.
Jan 27 22:29:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:09.788 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.793 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.794 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.795 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.796 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.796 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:29:09 compute-1 nova_compute[183751]: 2026-01-27 22:29:09.797 183755 DEBUG nova.virt.libvirt.driver [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:29:09 compute-1 podman[218185]: 2026-01-27 22:29:09.811786966 +0000 UTC m=+0.151406520 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, 
container_name=ovn_controller, org.label-schema.build-date=20260126)
Jan 27 22:29:10 compute-1 nova_compute[183751]: 2026-01-27 22:29:10.317 183755 INFO nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Took 9.97 seconds to spawn the instance on the hypervisor.
Jan 27 22:29:10 compute-1 nova_compute[183751]: 2026-01-27 22:29:10.317 183755 DEBUG nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 27 22:29:10 compute-1 nova_compute[183751]: 2026-01-27 22:29:10.552 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:10 compute-1 nova_compute[183751]: 2026-01-27 22:29:10.856 183755 INFO nova.compute.manager [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Took 15.42 seconds to build instance.
Jan 27 22:29:11 compute-1 nova_compute[183751]: 2026-01-27 22:29:11.047 183755 DEBUG nova.compute.manager [req-f9bd26e1-64f1-4e80-9a3d-b7c9e331d2bd req-f828b52d-6ef7-4ef6-905d-199489aa7508 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Received event network-vif-plugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:29:11 compute-1 nova_compute[183751]: 2026-01-27 22:29:11.048 183755 DEBUG oslo_concurrency.lockutils [req-f9bd26e1-64f1-4e80-9a3d-b7c9e331d2bd req-f828b52d-6ef7-4ef6-905d-199489aa7508 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:11 compute-1 nova_compute[183751]: 2026-01-27 22:29:11.048 183755 DEBUG oslo_concurrency.lockutils [req-f9bd26e1-64f1-4e80-9a3d-b7c9e331d2bd req-f828b52d-6ef7-4ef6-905d-199489aa7508 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:11 compute-1 nova_compute[183751]: 2026-01-27 22:29:11.048 183755 DEBUG oslo_concurrency.lockutils [req-f9bd26e1-64f1-4e80-9a3d-b7c9e331d2bd req-f828b52d-6ef7-4ef6-905d-199489aa7508 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:11 compute-1 nova_compute[183751]: 2026-01-27 22:29:11.049 183755 DEBUG nova.compute.manager [req-f9bd26e1-64f1-4e80-9a3d-b7c9e331d2bd req-f828b52d-6ef7-4ef6-905d-199489aa7508 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] No waiting events found dispatching network-vif-plugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:29:11 compute-1 nova_compute[183751]: 2026-01-27 22:29:11.049 183755 WARNING nova.compute.manager [req-f9bd26e1-64f1-4e80-9a3d-b7c9e331d2bd req-f828b52d-6ef7-4ef6-905d-199489aa7508 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Received unexpected event network-vif-plugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 for instance with vm_state active and task_state None.
Jan 27 22:29:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:11.276 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:11.276 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:11.277 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:11 compute-1 nova_compute[183751]: 2026-01-27 22:29:11.363 183755 DEBUG oslo_concurrency.lockutils [None req-5eadea90-56d1-4fbb-bcb3-4105720e2f10 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.944s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:11 compute-1 nova_compute[183751]: 2026-01-27 22:29:11.548 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:12 compute-1 nova_compute[183751]: 2026-01-27 22:29:12.942 183755 DEBUG oslo_concurrency.lockutils [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "3b489138-954b-41f7-ad29-32ba9abe9bc1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:12 compute-1 nova_compute[183751]: 2026-01-27 22:29:12.943 183755 DEBUG oslo_concurrency.lockutils [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:12 compute-1 nova_compute[183751]: 2026-01-27 22:29:12.944 183755 DEBUG oslo_concurrency.lockutils [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:12 compute-1 nova_compute[183751]: 2026-01-27 22:29:12.944 183755 DEBUG oslo_concurrency.lockutils [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:12 compute-1 nova_compute[183751]: 2026-01-27 22:29:12.944 183755 DEBUG oslo_concurrency.lockutils [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:12 compute-1 nova_compute[183751]: 2026-01-27 22:29:12.958 183755 INFO nova.compute.manager [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Terminating instance
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.477 183755 DEBUG nova.compute.manager [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 27 22:29:13 compute-1 kernel: tap0a73a1cc-3c (unregistering): left promiscuous mode
Jan 27 22:29:13 compute-1 NetworkManager[56069]: <info>  [1769552953.5038] device (tap0a73a1cc-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.522 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:13 compute-1 ovn_controller[95969]: 2026-01-27T22:29:13Z|00088|binding|INFO|Releasing lport 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 from this chassis (sb_readonly=0)
Jan 27 22:29:13 compute-1 ovn_controller[95969]: 2026-01-27T22:29:13Z|00089|binding|INFO|Setting lport 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 down in Southbound
Jan 27 22:29:13 compute-1 ovn_controller[95969]: 2026-01-27T22:29:13Z|00090|binding|INFO|Removing iface tap0a73a1cc-3c ovn-installed in OVS
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.524 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.531 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:64:53 10.100.0.13'], port_security=['fa:16:3e:79:64:53 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3b489138-954b-41f7-ad29-32ba9abe9bc1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a3669bc840040159a7655f1b219810c', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f3a2e630-65c9-42f0-93ec-c493ad0a7683', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4db3b16f-6743-4c2d-a5f9-9b2b6d3313bd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.532 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 in datapath d5f6cbcb-0015-404b-a15e-d163be3d6b1a unbound from our chassis
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.534 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5f6cbcb-0015-404b-a15e-d163be3d6b1a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.535 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa51357-7fef-4b0c-82b4-484655541c2a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.535 105247 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a namespace which is not needed anymore
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.544 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:13 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 27 22:29:13 compute-1 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Consumed 4.957s CPU time.
Jan 27 22:29:13 compute-1 systemd-machined[155034]: Machine qemu-6-instance-0000000c terminated.
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.661 183755 DEBUG nova.compute.manager [req-a48676fd-ceea-47b0-b8bf-9b5e2c8c5e42 req-5d574d73-ab7e-49a8-b987-af28df7ed070 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Received event network-vif-unplugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.662 183755 DEBUG oslo_concurrency.lockutils [req-a48676fd-ceea-47b0-b8bf-9b5e2c8c5e42 req-5d574d73-ab7e-49a8-b987-af28df7ed070 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.662 183755 DEBUG oslo_concurrency.lockutils [req-a48676fd-ceea-47b0-b8bf-9b5e2c8c5e42 req-5d574d73-ab7e-49a8-b987-af28df7ed070 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.663 183755 DEBUG oslo_concurrency.lockutils [req-a48676fd-ceea-47b0-b8bf-9b5e2c8c5e42 req-5d574d73-ab7e-49a8-b987-af28df7ed070 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.663 183755 DEBUG nova.compute.manager [req-a48676fd-ceea-47b0-b8bf-9b5e2c8c5e42 req-5d574d73-ab7e-49a8-b987-af28df7ed070 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] No waiting events found dispatching network-vif-unplugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.664 183755 DEBUG nova.compute.manager [req-a48676fd-ceea-47b0-b8bf-9b5e2c8c5e42 req-5d574d73-ab7e-49a8-b987-af28df7ed070 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Received event network-vif-unplugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:29:13 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[218188]: [NOTICE]   (218208) : haproxy version is 3.0.5-8e879a5
Jan 27 22:29:13 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[218188]: [NOTICE]   (218208) : path to executable is /usr/sbin/haproxy
Jan 27 22:29:13 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[218188]: [WARNING]  (218208) : Exiting Master process...
Jan 27 22:29:13 compute-1 podman[218255]: 2026-01-27 22:29:13.693379104 +0000 UTC m=+0.037995250 container kill f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 27 22:29:13 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[218188]: [ALERT]    (218208) : Current worker (218215) exited with code 143 (Terminated)
Jan 27 22:29:13 compute-1 neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a[218188]: [WARNING]  (218208) : All workers exited. Exiting... (0)
Jan 27 22:29:13 compute-1 systemd[1]: libpod-f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6.scope: Deactivated successfully.
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.748 183755 INFO nova.virt.libvirt.driver [-] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Instance destroyed successfully.
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.749 183755 DEBUG nova.objects.instance [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lazy-loading 'resources' on Instance uuid 3b489138-954b-41f7-ad29-32ba9abe9bc1 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:29:13 compute-1 podman[218274]: 2026-01-27 22:29:13.765485735 +0000 UTC m=+0.045070544 container died f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Jan 27 22:29:13 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6-userdata-shm.mount: Deactivated successfully.
Jan 27 22:29:13 compute-1 systemd[1]: var-lib-containers-storage-overlay-bad692c5f86c2e1f32f7d5776008d8c60d8a318b4574bef45487b6d8e01d39d6-merged.mount: Deactivated successfully.
Jan 27 22:29:13 compute-1 podman[218274]: 2026-01-27 22:29:13.810411295 +0000 UTC m=+0.089996014 container cleanup f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:29:13 compute-1 systemd[1]: libpod-conmon-f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6.scope: Deactivated successfully.
Jan 27 22:29:13 compute-1 podman[218280]: 2026-01-27 22:29:13.830333757 +0000 UTC m=+0.090174159 container remove f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2)
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.838 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b05f31-d5b6-4bf5-ae79-f47ef08228dc]: (4, ("Tue Jan 27 10:29:13 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a (f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6)\nf89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6\nTue Jan 27 10:29:13 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a (f89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6)\nf89d1af12212830a6b97dc15bcdd9ca519707c186c17179b96b7204973abd6c6\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.840 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[4e7722ed-4e05-4ca1-85f2-ecfb7732059f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.841 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5f6cbcb-0015-404b-a15e-d163be3d6b1a.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.841 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[3348160b-4c7b-4432-acf2-588af32c5140]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.842 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5f6cbcb-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.885 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:13 compute-1 kernel: tapd5f6cbcb-00: left promiscuous mode
Jan 27 22:29:13 compute-1 nova_compute[183751]: 2026-01-27 22:29:13.916 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.919 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[f92d5684-8c0e-44c7-86f6-84d314200739]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.934 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[083267bc-be43-4f74-aa99-799bddcdb4ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.935 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf9d90a-74dc-41cb-a295-5ad817ac4ba9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.952 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7381ca48-910b-4910-bed4-364332ddf1bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 909096, 'reachable_time': 29777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218319, 'error': None, 'target': 'ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:13 compute-1 systemd[1]: run-netns-ovnmeta\x2dd5f6cbcb\x2d0015\x2d404b\x2da15e\x2dd163be3d6b1a.mount: Deactivated successfully.
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.957 105687 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d5f6cbcb-0015-404b-a15e-d163be3d6b1a deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 27 22:29:13 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:13.957 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[305a90aa-717c-4196-8500-925b8beca4e3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.256 183755 DEBUG nova.virt.libvirt.vif [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-27T22:28:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1121773310',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-112',id=12,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-27T22:29:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a3669bc840040159a7655f1b219810c',ramdisk_id='',reservation_id='r-f00k0lr8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-1928383885-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-27T22:29:10Z,user_data=None,user_id='4b9b219e067b4a669e9564e586cb41cd',uuid=3b489138-954b-41f7-ad29-32ba9abe9bc1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "address": "fa:16:3e:79:64:53", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a73a1cc-3c", "ovs_interfaceid": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.257 183755 DEBUG nova.network.os_vif_util [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converting VIF {"id": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "address": "fa:16:3e:79:64:53", "network": {"id": "d5f6cbcb-0015-404b-a15e-d163be3d6b1a", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-2032210722-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e10517e36e1d445eb9e5770571d01f35", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a73a1cc-3c", "ovs_interfaceid": "0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.258 183755 DEBUG nova.network.os_vif_util [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:64:53,bridge_name='br-int',has_traffic_filtering=True,id=0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a73a1cc-3c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.258 183755 DEBUG os_vif [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:64:53,bridge_name='br-int',has_traffic_filtering=True,id=0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a73a1cc-3c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.261 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.261 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a73a1cc-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.263 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.266 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.267 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.267 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c82d2ff5-39e1-4e19-a216-0fa27c299a6f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.268 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.269 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.271 183755 INFO os_vif [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:64:53,bridge_name='br-int',has_traffic_filtering=True,id=0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2,network=Network(d5f6cbcb-0015-404b-a15e-d163be3d6b1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a73a1cc-3c')
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.272 183755 INFO nova.virt.libvirt.driver [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Deleting instance files /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1_del
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.272 183755 INFO nova.virt.libvirt.driver [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Deletion of /var/lib/nova/instances/3b489138-954b-41f7-ad29-32ba9abe9bc1_del complete
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.787 183755 INFO nova.compute.manager [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Took 1.31 seconds to destroy the instance on the hypervisor.
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.787 183755 DEBUG oslo.service.backend._eventlet.loopingcall [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.788 183755 DEBUG nova.compute.manager [-] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.788 183755 DEBUG nova.network.neutron [-] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 27 22:29:14 compute-1 nova_compute[183751]: 2026-01-27 22:29:14.788 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:29:15 compute-1 nova_compute[183751]: 2026-01-27 22:29:15.335 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:29:15 compute-1 nova_compute[183751]: 2026-01-27 22:29:15.739 183755 DEBUG nova.compute.manager [req-a0252ee9-96d8-4e5d-8d4e-899f786088b9 req-23029e0c-55dc-45c3-9f7a-e8d82acafefd 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Received event network-vif-unplugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:29:15 compute-1 nova_compute[183751]: 2026-01-27 22:29:15.739 183755 DEBUG oslo_concurrency.lockutils [req-a0252ee9-96d8-4e5d-8d4e-899f786088b9 req-23029e0c-55dc-45c3-9f7a-e8d82acafefd 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:15 compute-1 nova_compute[183751]: 2026-01-27 22:29:15.740 183755 DEBUG oslo_concurrency.lockutils [req-a0252ee9-96d8-4e5d-8d4e-899f786088b9 req-23029e0c-55dc-45c3-9f7a-e8d82acafefd 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:15 compute-1 nova_compute[183751]: 2026-01-27 22:29:15.740 183755 DEBUG oslo_concurrency.lockutils [req-a0252ee9-96d8-4e5d-8d4e-899f786088b9 req-23029e0c-55dc-45c3-9f7a-e8d82acafefd 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:15 compute-1 nova_compute[183751]: 2026-01-27 22:29:15.740 183755 DEBUG nova.compute.manager [req-a0252ee9-96d8-4e5d-8d4e-899f786088b9 req-23029e0c-55dc-45c3-9f7a-e8d82acafefd 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] No waiting events found dispatching network-vif-unplugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:29:15 compute-1 nova_compute[183751]: 2026-01-27 22:29:15.740 183755 DEBUG nova.compute.manager [req-a0252ee9-96d8-4e5d-8d4e-899f786088b9 req-23029e0c-55dc-45c3-9f7a-e8d82acafefd 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Received event network-vif-unplugged-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:29:15 compute-1 nova_compute[183751]: 2026-01-27 22:29:15.751 183755 DEBUG nova.compute.manager [req-c47ab0dc-c1cf-44e1-985b-1e4c5c15a413 req-63e2cce6-9d18-4e91-bfca-1d11b5b05f85 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Received event network-vif-deleted-0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:29:15 compute-1 nova_compute[183751]: 2026-01-27 22:29:15.752 183755 INFO nova.compute.manager [req-c47ab0dc-c1cf-44e1-985b-1e4c5c15a413 req-63e2cce6-9d18-4e91-bfca-1d11b5b05f85 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Neutron deleted interface 0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2; detaching it from the instance and deleting it from the info cache
Jan 27 22:29:15 compute-1 nova_compute[183751]: 2026-01-27 22:29:15.752 183755 DEBUG nova.network.neutron [req-c47ab0dc-c1cf-44e1-985b-1e4c5c15a413 req-63e2cce6-9d18-4e91-bfca-1d11b5b05f85 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:29:15 compute-1 podman[218321]: 2026-01-27 22:29:15.780937558 +0000 UTC m=+0.082637263 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260126, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:29:15 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:15.789 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:29:15 compute-1 podman[218320]: 2026-01-27 22:29:15.801858814 +0000 UTC m=+0.102296917 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=openstack_network_exporter)
Jan 27 22:29:16 compute-1 nova_compute[183751]: 2026-01-27 22:29:16.183 183755 DEBUG nova.network.neutron [-] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:29:16 compute-1 nova_compute[183751]: 2026-01-27 22:29:16.261 183755 DEBUG nova.compute.manager [req-c47ab0dc-c1cf-44e1-985b-1e4c5c15a413 req-63e2cce6-9d18-4e91-bfca-1d11b5b05f85 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Detach interface failed, port_id=0a73a1cc-3ccc-4bb8-a416-6a59ee2812a2, reason: Instance 3b489138-954b-41f7-ad29-32ba9abe9bc1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 27 22:29:16 compute-1 nova_compute[183751]: 2026-01-27 22:29:16.547 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:16 compute-1 nova_compute[183751]: 2026-01-27 22:29:16.691 183755 INFO nova.compute.manager [-] [instance: 3b489138-954b-41f7-ad29-32ba9abe9bc1] Took 1.90 seconds to deallocate network for instance.
Jan 27 22:29:17 compute-1 nova_compute[183751]: 2026-01-27 22:29:17.213 183755 DEBUG oslo_concurrency.lockutils [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:17 compute-1 nova_compute[183751]: 2026-01-27 22:29:17.214 183755 DEBUG oslo_concurrency.lockutils [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:17 compute-1 nova_compute[183751]: 2026-01-27 22:29:17.306 183755 DEBUG nova.compute.provider_tree [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:29:17 compute-1 nova_compute[183751]: 2026-01-27 22:29:17.816 183755 DEBUG nova.scheduler.client.report [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:29:18 compute-1 nova_compute[183751]: 2026-01-27 22:29:18.329 183755 DEBUG oslo_concurrency.lockutils [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:18 compute-1 nova_compute[183751]: 2026-01-27 22:29:18.378 183755 INFO nova.scheduler.client.report [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Deleted allocations for instance 3b489138-954b-41f7-ad29-32ba9abe9bc1
Jan 27 22:29:19 compute-1 nova_compute[183751]: 2026-01-27 22:29:19.270 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:19 compute-1 nova_compute[183751]: 2026-01-27 22:29:19.412 183755 DEBUG oslo_concurrency.lockutils [None req-1f10996b-639d-4cf9-b5c5-b31764e4439a 4b9b219e067b4a669e9564e586cb41cd 1a3669bc840040159a7655f1b219810c - - default default] Lock "3b489138-954b-41f7-ad29-32ba9abe9bc1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.469s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:19 compute-1 openstack_network_exporter[195945]: ERROR   22:29:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:29:19 compute-1 openstack_network_exporter[195945]: ERROR   22:29:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:29:20 compute-1 sshd-session[218358]: Received disconnect from 45.148.10.151 port 40398:11:  [preauth]
Jan 27 22:29:20 compute-1 sshd-session[218358]: Disconnected from authenticating user root 45.148.10.151 port 40398 [preauth]
Jan 27 22:29:21 compute-1 nova_compute[183751]: 2026-01-27 22:29:21.585 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:24 compute-1 nova_compute[183751]: 2026-01-27 22:29:24.271 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:24 compute-1 podman[218360]: 2026-01-27 22:29:24.762248801 +0000 UTC m=+0.069588140 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:29:26 compute-1 nova_compute[183751]: 2026-01-27 22:29:26.589 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:29 compute-1 nova_compute[183751]: 2026-01-27 22:29:29.273 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:31 compute-1 nova_compute[183751]: 2026-01-27 22:29:31.590 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:33 compute-1 nova_compute[183751]: 2026-01-27 22:29:33.230 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:34 compute-1 nova_compute[183751]: 2026-01-27 22:29:34.275 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:35 compute-1 podman[193064]: time="2026-01-27T22:29:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:29:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:29:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:29:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:29:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2184 "" "Go-http-client/1.1"
Jan 27 22:29:36 compute-1 nova_compute[183751]: 2026-01-27 22:29:36.637 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:39 compute-1 nova_compute[183751]: 2026-01-27 22:29:39.277 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:40 compute-1 podman[218386]: 2026-01-27 22:29:40.819875014 +0000 UTC m=+0.127265894 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 22:29:41 compute-1 nova_compute[183751]: 2026-01-27 22:29:41.639 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:43.880 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:02:d0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf213dc42fc04bdbb0c321ce189cfdb8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a01a449d-053f-4da0-a741-7ac912b0905f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=69c7ff2f-6330-402c-b9df-c8f2c509c282) old=Port_Binding(mac=['fa:16:3e:54:02:d0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf213dc42fc04bdbb0c321ce189cfdb8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:29:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:43.882 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 69c7ff2f-6330-402c-b9df-c8f2c509c282 in datapath bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1 updated
Jan 27 22:29:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:43.883 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:29:43 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:29:43.885 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[a35cab98-d462-458a-9a06-a833e99fc72e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:29:44 compute-1 nova_compute[183751]: 2026-01-27 22:29:44.279 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:46 compute-1 nova_compute[183751]: 2026-01-27 22:29:46.695 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:46 compute-1 podman[218413]: 2026-01-27 22:29:46.779651054 +0000 UTC m=+0.053189685 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20260126, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 27 22:29:46 compute-1 podman[218412]: 2026-01-27 22:29:46.811300696 +0000 UTC m=+0.079779672 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 27 22:29:49 compute-1 nova_compute[183751]: 2026-01-27 22:29:49.281 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:49 compute-1 openstack_network_exporter[195945]: ERROR   22:29:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:29:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:29:49 compute-1 openstack_network_exporter[195945]: ERROR   22:29:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:29:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:29:51 compute-1 nova_compute[183751]: 2026-01-27 22:29:51.697 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:54 compute-1 nova_compute[183751]: 2026-01-27 22:29:54.283 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:55 compute-1 podman[218452]: 2026-01-27 22:29:55.810555054 +0000 UTC m=+0.110594963 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:29:56 compute-1 nova_compute[183751]: 2026-01-27 22:29:56.698 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.668 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.941 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.943 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.982 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.983 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5846MB free_disk=73.14262008666992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.983 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:29:58 compute-1 nova_compute[183751]: 2026-01-27 22:29:58.984 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:29:59 compute-1 nova_compute[183751]: 2026-01-27 22:29:59.286 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:00 compute-1 nova_compute[183751]: 2026-01-27 22:30:00.054 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:30:00 compute-1 nova_compute[183751]: 2026-01-27 22:30:00.054 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:29:58 up  2:32,  0 user,  load average: 0.35, 0.30, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:30:00 compute-1 nova_compute[183751]: 2026-01-27 22:30:00.190 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:30:00 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:00.666 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:46:2a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a7e40da5-530e-4947-9a09-4f0ecd063758', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7e40da5-530e-4947-9a09-4f0ecd063758', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f5e7d0ea0cbe4c4f9e5e1490f6110cec', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c97ddbff-a437-4531-804e-6f5013a4679c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b2178d96-77d2-487e-8839-4d617716d128) old=Port_Binding(mac=['fa:16:3e:9e:46:2a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a7e40da5-530e-4947-9a09-4f0ecd063758', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a7e40da5-530e-4947-9a09-4f0ecd063758', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f5e7d0ea0cbe4c4f9e5e1490f6110cec', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:30:00 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:00.667 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b2178d96-77d2-487e-8839-4d617716d128 in datapath a7e40da5-530e-4947-9a09-4f0ecd063758 updated
Jan 27 22:30:00 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:00.668 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a7e40da5-530e-4947-9a09-4f0ecd063758, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:30:00 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:00.670 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[955eca3c-2b47-4348-afa8-8955ffb5239c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:00 compute-1 nova_compute[183751]: 2026-01-27 22:30:00.699 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:30:01 compute-1 nova_compute[183751]: 2026-01-27 22:30:01.211 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:30:01 compute-1 nova_compute[183751]: 2026-01-27 22:30:01.212 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.228s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:30:01 compute-1 nova_compute[183751]: 2026-01-27 22:30:01.702 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:03 compute-1 nova_compute[183751]: 2026-01-27 22:30:03.211 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:30:03 compute-1 nova_compute[183751]: 2026-01-27 22:30:03.212 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:30:03 compute-1 nova_compute[183751]: 2026-01-27 22:30:03.212 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:30:03 compute-1 nova_compute[183751]: 2026-01-27 22:30:03.212 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:30:03 compute-1 nova_compute[183751]: 2026-01-27 22:30:03.212 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:30:04 compute-1 nova_compute[183751]: 2026-01-27 22:30:04.290 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:05 compute-1 nova_compute[183751]: 2026-01-27 22:30:05.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:30:05 compute-1 nova_compute[183751]: 2026-01-27 22:30:05.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:30:05 compute-1 podman[193064]: time="2026-01-27T22:30:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:30:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:30:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:30:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:30:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Jan 27 22:30:06 compute-1 nova_compute[183751]: 2026-01-27 22:30:06.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:30:06 compute-1 ovn_controller[95969]: 2026-01-27T22:30:06Z|00091|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 27 22:30:06 compute-1 nova_compute[183751]: 2026-01-27 22:30:06.704 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:09 compute-1 nova_compute[183751]: 2026-01-27 22:30:09.292 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:11.278 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:30:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:11.279 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:30:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:11.279 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:30:11 compute-1 nova_compute[183751]: 2026-01-27 22:30:11.708 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:11 compute-1 podman[218480]: 2026-01-27 22:30:11.842994814 +0000 UTC m=+0.138666856 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Jan 27 22:30:14 compute-1 nova_compute[183751]: 2026-01-27 22:30:14.294 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:16 compute-1 nova_compute[183751]: 2026-01-27 22:30:16.744 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:17 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:17.072 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:30:17 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:17.075 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:30:17 compute-1 nova_compute[183751]: 2026-01-27 22:30:17.074 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:17 compute-1 podman[218510]: 2026-01-27 22:30:17.78113438 +0000 UTC m=+0.078297354 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Jan 27 22:30:17 compute-1 podman[218509]: 2026-01-27 22:30:17.792818288 +0000 UTC m=+0.091258494 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 27 22:30:18 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:18.076 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:30:19 compute-1 nova_compute[183751]: 2026-01-27 22:30:19.296 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:19 compute-1 openstack_network_exporter[195945]: ERROR   22:30:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:30:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:30:19 compute-1 openstack_network_exporter[195945]: ERROR   22:30:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:30:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:30:21 compute-1 nova_compute[183751]: 2026-01-27 22:30:21.781 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:24 compute-1 nova_compute[183751]: 2026-01-27 22:30:24.298 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:26 compute-1 nova_compute[183751]: 2026-01-27 22:30:26.784 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:26 compute-1 podman[218547]: 2026-01-27 22:30:26.800630046 +0000 UTC m=+0.088922417 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:30:28 compute-1 nova_compute[183751]: 2026-01-27 22:30:28.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:30:29 compute-1 nova_compute[183751]: 2026-01-27 22:30:29.300 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:31 compute-1 nova_compute[183751]: 2026-01-27 22:30:31.785 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:34 compute-1 nova_compute[183751]: 2026-01-27 22:30:34.302 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:35 compute-1 podman[193064]: time="2026-01-27T22:30:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:30:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:30:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:30:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:30:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Jan 27 22:30:36 compute-1 nova_compute[183751]: 2026-01-27 22:30:36.808 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:39 compute-1 nova_compute[183751]: 2026-01-27 22:30:39.304 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:40 compute-1 nova_compute[183751]: 2026-01-27 22:30:40.272 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquiring lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:30:40 compute-1 nova_compute[183751]: 2026-01-27 22:30:40.272 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:30:40 compute-1 nova_compute[183751]: 2026-01-27 22:30:40.779 183755 DEBUG nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 27 22:30:41 compute-1 nova_compute[183751]: 2026-01-27 22:30:41.331 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:30:41 compute-1 nova_compute[183751]: 2026-01-27 22:30:41.332 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:30:41 compute-1 nova_compute[183751]: 2026-01-27 22:30:41.341 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 27 22:30:41 compute-1 nova_compute[183751]: 2026-01-27 22:30:41.342 183755 INFO nova.compute.claims [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Claim successful on node compute-1.ctlplane.example.com
Jan 27 22:30:41 compute-1 nova_compute[183751]: 2026-01-27 22:30:41.811 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:42 compute-1 nova_compute[183751]: 2026-01-27 22:30:42.414 183755 DEBUG nova.compute.provider_tree [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:30:42 compute-1 podman[218573]: 2026-01-27 22:30:42.835354824 +0000 UTC m=+0.151038041 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126)
Jan 27 22:30:42 compute-1 nova_compute[183751]: 2026-01-27 22:30:42.923 183755 DEBUG nova.scheduler.client.report [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:30:43 compute-1 nova_compute[183751]: 2026-01-27 22:30:43.434 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:30:43 compute-1 nova_compute[183751]: 2026-01-27 22:30:43.435 183755 DEBUG nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 27 22:30:43 compute-1 nova_compute[183751]: 2026-01-27 22:30:43.952 183755 DEBUG nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 27 22:30:43 compute-1 nova_compute[183751]: 2026-01-27 22:30:43.952 183755 DEBUG nova.network.neutron [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 27 22:30:43 compute-1 nova_compute[183751]: 2026-01-27 22:30:43.953 183755 WARNING neutronclient.v2_0.client [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:30:43 compute-1 nova_compute[183751]: 2026-01-27 22:30:43.953 183755 WARNING neutronclient.v2_0.client [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:30:44 compute-1 nova_compute[183751]: 2026-01-27 22:30:44.306 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:44 compute-1 nova_compute[183751]: 2026-01-27 22:30:44.461 183755 INFO nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 27 22:30:44 compute-1 nova_compute[183751]: 2026-01-27 22:30:44.970 183755 DEBUG nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 27 22:30:45 compute-1 nova_compute[183751]: 2026-01-27 22:30:45.897 183755 DEBUG nova.network.neutron [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Successfully created port: f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.008 183755 DEBUG nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.010 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.011 183755 INFO nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Creating image(s)
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.012 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquiring lock "/var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.013 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "/var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.014 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "/var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.015 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.022 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.024 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.103 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.104 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.105 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.105 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.109 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.110 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.198 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.199 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.236 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.238 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.238 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.318 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.319 183755 DEBUG nova.virt.disk.api [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Checking if we can resize image /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.320 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.373 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.375 183755 DEBUG nova.virt.disk.api [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Cannot resize image /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.375 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.376 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Ensure instance console log exists: /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.377 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.377 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.378 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:30:46 compute-1 nova_compute[183751]: 2026-01-27 22:30:46.822 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:47 compute-1 nova_compute[183751]: 2026-01-27 22:30:47.591 183755 DEBUG nova.network.neutron [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Successfully updated port: f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 27 22:30:47 compute-1 nova_compute[183751]: 2026-01-27 22:30:47.734 183755 DEBUG nova.compute.manager [req-c50d23d2-503a-4bd2-affd-ef1d430b4e78 req-1424e351-2d5e-4ff2-852c-35a36e82bf72 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Received event network-changed-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:30:47 compute-1 nova_compute[183751]: 2026-01-27 22:30:47.735 183755 DEBUG nova.compute.manager [req-c50d23d2-503a-4bd2-affd-ef1d430b4e78 req-1424e351-2d5e-4ff2-852c-35a36e82bf72 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Refreshing instance network info cache due to event network-changed-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 27 22:30:47 compute-1 nova_compute[183751]: 2026-01-27 22:30:47.735 183755 DEBUG oslo_concurrency.lockutils [req-c50d23d2-503a-4bd2-affd-ef1d430b4e78 req-1424e351-2d5e-4ff2-852c-35a36e82bf72 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "refresh_cache-8ef7d52b-ca91-4353-ba68-c3e66978c93d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:30:47 compute-1 nova_compute[183751]: 2026-01-27 22:30:47.736 183755 DEBUG oslo_concurrency.lockutils [req-c50d23d2-503a-4bd2-affd-ef1d430b4e78 req-1424e351-2d5e-4ff2-852c-35a36e82bf72 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquired lock "refresh_cache-8ef7d52b-ca91-4353-ba68-c3e66978c93d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:30:47 compute-1 nova_compute[183751]: 2026-01-27 22:30:47.736 183755 DEBUG nova.network.neutron [req-c50d23d2-503a-4bd2-affd-ef1d430b4e78 req-1424e351-2d5e-4ff2-852c-35a36e82bf72 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Refreshing network info cache for port f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 27 22:30:48 compute-1 nova_compute[183751]: 2026-01-27 22:30:48.101 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquiring lock "refresh_cache-8ef7d52b-ca91-4353-ba68-c3e66978c93d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:30:48 compute-1 nova_compute[183751]: 2026-01-27 22:30:48.244 183755 WARNING neutronclient.v2_0.client [req-c50d23d2-503a-4bd2-affd-ef1d430b4e78 req-1424e351-2d5e-4ff2-852c-35a36e82bf72 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:30:48 compute-1 nova_compute[183751]: 2026-01-27 22:30:48.576 183755 DEBUG nova.network.neutron [req-c50d23d2-503a-4bd2-affd-ef1d430b4e78 req-1424e351-2d5e-4ff2-852c-35a36e82bf72 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:30:48 compute-1 podman[218614]: 2026-01-27 22:30:48.814033871 +0000 UTC m=+0.100988955 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Jan 27 22:30:48 compute-1 podman[218615]: 2026-01-27 22:30:48.828426247 +0000 UTC m=+0.116250292 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Jan 27 22:30:48 compute-1 nova_compute[183751]: 2026-01-27 22:30:48.898 183755 DEBUG nova.network.neutron [req-c50d23d2-503a-4bd2-affd-ef1d430b4e78 req-1424e351-2d5e-4ff2-852c-35a36e82bf72 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:30:49 compute-1 nova_compute[183751]: 2026-01-27 22:30:49.308 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:49 compute-1 nova_compute[183751]: 2026-01-27 22:30:49.409 183755 DEBUG oslo_concurrency.lockutils [req-c50d23d2-503a-4bd2-affd-ef1d430b4e78 req-1424e351-2d5e-4ff2-852c-35a36e82bf72 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Releasing lock "refresh_cache-8ef7d52b-ca91-4353-ba68-c3e66978c93d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:30:49 compute-1 nova_compute[183751]: 2026-01-27 22:30:49.410 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquired lock "refresh_cache-8ef7d52b-ca91-4353-ba68-c3e66978c93d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:30:49 compute-1 nova_compute[183751]: 2026-01-27 22:30:49.411 183755 DEBUG nova.network.neutron [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 27 22:30:49 compute-1 openstack_network_exporter[195945]: ERROR   22:30:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:30:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:30:49 compute-1 openstack_network_exporter[195945]: ERROR   22:30:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:30:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:30:50 compute-1 nova_compute[183751]: 2026-01-27 22:30:50.048 183755 DEBUG nova.network.neutron [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:30:50 compute-1 nova_compute[183751]: 2026-01-27 22:30:50.388 183755 WARNING neutronclient.v2_0.client [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:30:50 compute-1 nova_compute[183751]: 2026-01-27 22:30:50.665 183755 DEBUG nova.network.neutron [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Updating instance_info_cache with network_info: [{"id": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "address": "fa:16:3e:5f:14:3a", "network": {"id": "bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-296801654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf213dc42fc04bdbb0c321ce189cfdb8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e0382c-6a", "ovs_interfaceid": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.172 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Releasing lock "refresh_cache-8ef7d52b-ca91-4353-ba68-c3e66978c93d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.174 183755 DEBUG nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Instance network_info: |[{"id": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "address": "fa:16:3e:5f:14:3a", "network": {"id": "bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-296801654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf213dc42fc04bdbb0c321ce189cfdb8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e0382c-6a", "ovs_interfaceid": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.178 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Start _get_guest_xml network_info=[{"id": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "address": "fa:16:3e:5f:14:3a", "network": {"id": "bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-296801654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf213dc42fc04bdbb0c321ce189cfdb8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e0382c-6a", "ovs_interfaceid": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '46eb297a-0b7d-41f9-8336-a7ae35b5797e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.184 183755 WARNING nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.186 183755 DEBUG nova.virt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2039109839', uuid='8ef7d52b-ca91-4353-ba68-c3e66978c93d'), owner=OwnerMeta(userid='c1895fcc128e498c8220a133604740e8', username='tempest-TestExecuteVmWorkloadBalanceStrategy-567132824-project-admin', projectid='f5e7d0ea0cbe4c4f9e5e1490f6110cec', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-567132824'), image=ImageMeta(id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "address": "fa:16:3e:5f:14:3a", "network": {"id": "bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-296801654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf213dc42fc04bdbb0c321ce189cfdb8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e0382c-6a", "ovs_interfaceid": 
"f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769553051.1861937) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.193 183755 DEBUG nova.virt.libvirt.host [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.194 183755 DEBUG nova.virt.libvirt.host [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.198 183755 DEBUG nova.virt.libvirt.host [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.198 183755 DEBUG nova.virt.libvirt.host [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.201 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.201 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-27T22:07:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.202 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.202 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.203 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.203 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.203 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.204 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.204 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.205 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.205 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.205 183755 DEBUG nova.virt.hardware [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.212 183755 DEBUG nova.virt.libvirt.vif [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:30:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2039109839',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2039109839',id=14,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f5e7d0ea0cbe4c4f9e5e1490f6110cec',ramdisk_id='',reservation_id='r-wrzr4otu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-567132824',owner_user_name='temp
est-TestExecuteVmWorkloadBalanceStrategy-567132824-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:30:45Z,user_data=None,user_id='c1895fcc128e498c8220a133604740e8',uuid=8ef7d52b-ca91-4353-ba68-c3e66978c93d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "address": "fa:16:3e:5f:14:3a", "network": {"id": "bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-296801654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf213dc42fc04bdbb0c321ce189cfdb8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e0382c-6a", "ovs_interfaceid": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.212 183755 DEBUG nova.network.os_vif_util [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Converting VIF {"id": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "address": "fa:16:3e:5f:14:3a", "network": {"id": "bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-296801654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf213dc42fc04bdbb0c321ce189cfdb8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e0382c-6a", "ovs_interfaceid": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.214 183755 DEBUG nova.network.os_vif_util [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:3a,bridge_name='br-int',has_traffic_filtering=True,id=f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0,network=Network(bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e0382c-6a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.216 183755 DEBUG nova.objects.instance [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lazy-loading 'pci_devices' on Instance uuid 8ef7d52b-ca91-4353-ba68-c3e66978c93d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.727 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] End _get_guest_xml xml=<domain type="kvm">
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <uuid>8ef7d52b-ca91-4353-ba68-c3e66978c93d</uuid>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <name>instance-0000000e</name>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <memory>131072</memory>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <vcpu>1</vcpu>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <metadata>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-2039109839</nova:name>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <nova:creationTime>2026-01-27 22:30:51</nova:creationTime>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <nova:flavor name="m1.nano" id="4ed8ebce-d846-4d13-950a-4fb8c3a02942">
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:memory>128</nova:memory>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:disk>1</nova:disk>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:swap>0</nova:swap>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:ephemeral>0</nova:ephemeral>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:vcpus>1</nova:vcpus>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:extraSpecs>
Jan 27 22:30:51 compute-1 nova_compute[183751]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         </nova:extraSpecs>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       </nova:flavor>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <nova:image uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e">
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:minDisk>1</nova:minDisk>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:minRam>0</nova:minRam>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:properties>
Jan 27 22:30:51 compute-1 nova_compute[183751]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         </nova:properties>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       </nova:image>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <nova:owner>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:user uuid="c1895fcc128e498c8220a133604740e8">tempest-TestExecuteVmWorkloadBalanceStrategy-567132824-project-admin</nova:user>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:project uuid="f5e7d0ea0cbe4c4f9e5e1490f6110cec">tempest-TestExecuteVmWorkloadBalanceStrategy-567132824</nova:project>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       </nova:owner>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <nova:root type="image" uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <nova:ports>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         <nova:port uuid="f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0">
Jan 27 22:30:51 compute-1 nova_compute[183751]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:         </nova:port>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       </nova:ports>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     </nova:instance>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   </metadata>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <sysinfo type="smbios">
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <system>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <entry name="manufacturer">RDO</entry>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <entry name="product">OpenStack Compute</entry>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <entry name="serial">8ef7d52b-ca91-4353-ba68-c3e66978c93d</entry>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <entry name="uuid">8ef7d52b-ca91-4353-ba68-c3e66978c93d</entry>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <entry name="family">Virtual Machine</entry>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     </system>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   </sysinfo>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <os>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <boot dev="hd"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <smbios mode="sysinfo"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   </os>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <features>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <acpi/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <apic/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <vmcoreinfo/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   </features>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <clock offset="utc">
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <timer name="pit" tickpolicy="delay"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <timer name="hpet" present="no"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   </clock>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <cpu mode="custom" match="exact">
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <model>Nehalem</model>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <topology sockets="1" cores="1" threads="1"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   </cpu>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   <devices>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <disk type="file" device="disk">
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <target dev="vda" bus="virtio"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <disk type="file" device="cdrom">
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <driver name="qemu" type="raw" cache="none"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk.config"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <target dev="sda" bus="sata"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <interface type="ethernet">
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <mac address="fa:16:3e:5f:14:3a"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <driver name="vhost" rx_queue_size="512"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <mtu size="1442"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <target dev="tapf0e0382c-6a"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     </interface>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <serial type="pty">
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <log file="/var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/console.log" append="off"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     </serial>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <video>
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     </video>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <input type="tablet" bus="usb"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <rng model="virtio">
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <backend model="random">/dev/urandom</backend>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     </rng>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <controller type="usb" index="0"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 27 22:30:51 compute-1 nova_compute[183751]:       <stats period="10"/>
Jan 27 22:30:51 compute-1 nova_compute[183751]:     </memballoon>
Jan 27 22:30:51 compute-1 nova_compute[183751]:   </devices>
Jan 27 22:30:51 compute-1 nova_compute[183751]: </domain>
Jan 27 22:30:51 compute-1 nova_compute[183751]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.729 183755 DEBUG nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Preparing to wait for external event network-vif-plugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.729 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquiring lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.730 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.730 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.731 183755 DEBUG nova.virt.libvirt.vif [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:30:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2039109839',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2039109839',id=14,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f5e7d0ea0cbe4c4f9e5e1490f6110cec',ramdisk_id='',reservation_id='r-wrzr4otu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-567132824',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-567132824-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:30:45Z,user_data=None,user_id='c1895fcc128e498c8220a133604740e8',uuid=8ef7d52b-ca91-4353-ba68-c3e66978c93d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "address": "fa:16:3e:5f:14:3a", "network": {"id": "bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-296801654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf213dc42fc04bdbb0c321ce189cfdb8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e0382c-6a", "ovs_interfaceid": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.731 183755 DEBUG nova.network.os_vif_util [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Converting VIF {"id": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "address": "fa:16:3e:5f:14:3a", "network": {"id": "bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-296801654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf213dc42fc04bdbb0c321ce189cfdb8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e0382c-6a", "ovs_interfaceid": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.732 183755 DEBUG nova.network.os_vif_util [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:3a,bridge_name='br-int',has_traffic_filtering=True,id=f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0,network=Network(bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e0382c-6a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.732 183755 DEBUG os_vif [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:3a,bridge_name='br-int',has_traffic_filtering=True,id=f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0,network=Network(bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e0382c-6a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.733 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.733 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.733 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.734 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.734 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5d746b3a-e759-55cb-a3e7-df2719aa224c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.736 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.738 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.738 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.741 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.742 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0e0382c-6a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.742 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf0e0382c-6a, col_values=(('qos', UUID('e58ca302-75d9-44a3-88ce-c17c2df20973')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.743 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf0e0382c-6a, col_values=(('external_ids', {'iface-id': 'f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:14:3a', 'vm-uuid': '8ef7d52b-ca91-4353-ba68-c3e66978c93d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.745 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:51 compute-1 NetworkManager[56069]: <info>  [1769553051.7470] manager: (tapf0e0382c-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.747 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.756 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.757 183755 INFO os_vif [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:3a,bridge_name='br-int',has_traffic_filtering=True,id=f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0,network=Network(bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e0382c-6a')
Jan 27 22:30:51 compute-1 nova_compute[183751]: 2026-01-27 22:30:51.823 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:53 compute-1 nova_compute[183751]: 2026-01-27 22:30:53.301 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:30:53 compute-1 nova_compute[183751]: 2026-01-27 22:30:53.302 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:30:53 compute-1 nova_compute[183751]: 2026-01-27 22:30:53.302 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] No VIF found with MAC fa:16:3e:5f:14:3a, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:30:53 compute-1 nova_compute[183751]: 2026-01-27 22:30:53.303 183755 INFO nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Using config drive
Jan 27 22:30:53 compute-1 nova_compute[183751]: 2026-01-27 22:30:53.821 183755 WARNING neutronclient.v2_0.client [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.164 183755 INFO nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Creating config drive at /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk.config
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.177 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpkh68oy9k execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.325 183755 DEBUG oslo_concurrency.processutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpkh68oy9k" returned: 0 in 0.147s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:30:54 compute-1 kernel: tapf0e0382c-6a: entered promiscuous mode
Jan 27 22:30:54 compute-1 NetworkManager[56069]: <info>  [1769553054.4213] manager: (tapf0e0382c-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Jan 27 22:30:54 compute-1 ovn_controller[95969]: 2026-01-27T22:30:54Z|00092|binding|INFO|Claiming lport f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 for this chassis.
Jan 27 22:30:54 compute-1 ovn_controller[95969]: 2026-01-27T22:30:54Z|00093|binding|INFO|f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0: Claiming fa:16:3e:5f:14:3a 10.100.0.13
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.422 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.428 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.432 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.449 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:14:3a 10.100.0.13'], port_security=['fa:16:3e:5f:14:3a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8ef7d52b-ca91-4353-ba68-c3e66978c93d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f5e7d0ea0cbe4c4f9e5e1490f6110cec', 'neutron:revision_number': '4', 'neutron:security_group_ids': '358bb44c-2ba1-4561-a48e-1e8eb336c84e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a01a449d-053f-4da0-a741-7ac912b0905f, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.450 105247 INFO neutron.agent.ovn.metadata.agent [-] Port f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 in datapath bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1 bound to our chassis
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.453 105247 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1
Jan 27 22:30:54 compute-1 systemd-udevd[218673]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.473 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[81fe7872-15eb-4cf7-a3fa-57c5b280d2d5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.474 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbfdc4bdd-01 in ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.477 212869 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbfdc4bdd-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.478 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd1438e-1e7c-455a-91b2-98538b8f963b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.478 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0de84d-106e-4f0e-b9f6-027ac91835e6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 systemd-machined[155034]: New machine qemu-7-instance-0000000e.
Jan 27 22:30:54 compute-1 NetworkManager[56069]: <info>  [1769553054.4923] device (tapf0e0382c-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 22:30:54 compute-1 NetworkManager[56069]: <info>  [1769553054.4942] device (tapf0e0382c-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.499 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[9538b303-7168-440c-8e51-7be0a6d679ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.512 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.513 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[220a95ba-83c7-4e7c-8e6b-6bea1609608c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 systemd[1]: Started Virtual Machine qemu-7-instance-0000000e.
Jan 27 22:30:54 compute-1 ovn_controller[95969]: 2026-01-27T22:30:54Z|00094|binding|INFO|Setting lport f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 ovn-installed in OVS
Jan 27 22:30:54 compute-1 ovn_controller[95969]: 2026-01-27T22:30:54Z|00095|binding|INFO|Setting lport f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 up in Southbound
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.522 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.549 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[7804ca15-04be-491e-a1b3-16d389a8a26f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.557 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[01b384df-fa5f-496c-9c5f-05566cd0d194]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 NetworkManager[56069]: <info>  [1769553054.5595] manager: (tapbfdc4bdd-00): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.596 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b269b1-6d01-4c80-9e9f-bb978efa94d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.600 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[f7943c5c-31ba-4f31-84c0-837861b7acc5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 NetworkManager[56069]: <info>  [1769553054.6308] device (tapbfdc4bdd-00): carrier: link connected
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.640 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[15e4e2a9-21fa-4a7f-b208-52b2992855f8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.667 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[6118d31e-3718-4a64-881f-354699b57dba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbfdc4bdd-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:54:02:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 919689, 'reachable_time': 23252, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218706, 'error': None, 'target': 'ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.686 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7974c843-51fc-4e6d-8968-43253bf4b4c3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe54:2d0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 919689, 'tstamp': 919689}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218707, 'error': None, 'target': 'ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.706 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c39e527b-b789-40bf-9b62-21fd413dbec7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbfdc4bdd-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:54:02:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 919689, 'reachable_time': 23252, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218708, 'error': None, 'target': 'ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.749 183755 DEBUG nova.compute.manager [req-ecc7d75b-0976-42f4-946e-84bcd73c68de req-302c9947-75b8-4ff1-9d35-f94e6f0b086f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Received event network-vif-plugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.750 183755 DEBUG oslo_concurrency.lockutils [req-ecc7d75b-0976-42f4-946e-84bcd73c68de req-302c9947-75b8-4ff1-9d35-f94e6f0b086f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.750 183755 DEBUG oslo_concurrency.lockutils [req-ecc7d75b-0976-42f4-946e-84bcd73c68de req-302c9947-75b8-4ff1-9d35-f94e6f0b086f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.750 183755 DEBUG oslo_concurrency.lockutils [req-ecc7d75b-0976-42f4-946e-84bcd73c68de req-302c9947-75b8-4ff1-9d35-f94e6f0b086f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.750 183755 DEBUG nova.compute.manager [req-ecc7d75b-0976-42f4-946e-84bcd73c68de req-302c9947-75b8-4ff1-9d35-f94e6f0b086f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Processing event network-vif-plugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.751 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1d0dc8-cc82-4bea-86da-b48a4bb2acf8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.845 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[eb511e13-ddce-4d66-89ee-1f9b4666578f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.847 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfdc4bdd-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.848 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.848 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbfdc4bdd-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.851 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:54 compute-1 kernel: tapbfdc4bdd-00: entered promiscuous mode
Jan 27 22:30:54 compute-1 NetworkManager[56069]: <info>  [1769553054.8517] manager: (tapbfdc4bdd-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.853 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.854 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbfdc4bdd-00, col_values=(('external_ids', {'iface-id': '69c7ff2f-6330-402c-b9df-c8f2c509c282'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.855 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:54 compute-1 ovn_controller[95969]: 2026-01-27T22:30:54Z|00096|binding|INFO|Releasing lport 69c7ff2f-6330-402c-b9df-c8f2c509c282 from this chassis (sb_readonly=0)
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.873 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.875 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[a89efef4-423e-414d-9b11-c448488c3819]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.876 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.877 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.877 105247 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.877 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.878 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[52ae184a-c61c-4dbc-801d-4c7bc4a05976]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.878 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.878 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac3dc52-951d-4711-8d3a-8bf7690a780b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.879 105247 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: global
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     log         /dev/log local0 debug
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     log-tag     haproxy-metadata-proxy-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     user        root
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     group       root
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     maxconn     1024
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     pidfile     /var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     daemon
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: defaults
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     log global
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     mode http
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     option httplog
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     option dontlognull
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     option http-server-close
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     option forwardfor
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     retries                 3
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     timeout http-request    30s
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     timeout connect         30s
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     timeout client          32s
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     timeout server          32s
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     timeout http-keep-alive 30s
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: listen listener
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     bind 169.254.169.254:80
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     server metadata /var/lib/neutron/metadata_proxy
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:     http-request add-header X-OVN-Network-ID bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 27 22:30:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:30:54.880 105247 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'env', 'PROCESS_TAG=haproxy-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.981 183755 DEBUG nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 27 22:30:54 compute-1 nova_compute[183751]: 2026-01-27 22:30:54.993 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 27 22:30:55 compute-1 nova_compute[183751]: 2026-01-27 22:30:55.011 183755 INFO nova.virt.libvirt.driver [-] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Instance spawned successfully.
Jan 27 22:30:55 compute-1 nova_compute[183751]: 2026-01-27 22:30:55.011 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 27 22:30:55 compute-1 podman[218747]: 2026-01-27 22:30:55.376849086 +0000 UTC m=+0.069194780 container create 63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:30:55 compute-1 systemd[1]: Started libpod-conmon-63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8.scope.
Jan 27 22:30:55 compute-1 podman[218747]: 2026-01-27 22:30:55.344465846 +0000 UTC m=+0.036811520 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 22:30:55 compute-1 systemd[1]: Started libcrun container.
Jan 27 22:30:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9bc4380fdc41bc88113f200a18b95679e2b98cc1a18e164f087174edfc31205/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 22:30:55 compute-1 podman[218747]: 2026-01-27 22:30:55.489469258 +0000 UTC m=+0.181814952 container init 63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 22:30:55 compute-1 podman[218747]: 2026-01-27 22:30:55.500934611 +0000 UTC m=+0.193280275 container start 63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Jan 27 22:30:55 compute-1 nova_compute[183751]: 2026-01-27 22:30:55.527 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:30:55 compute-1 nova_compute[183751]: 2026-01-27 22:30:55.527 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:30:55 compute-1 nova_compute[183751]: 2026-01-27 22:30:55.528 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:30:55 compute-1 nova_compute[183751]: 2026-01-27 22:30:55.528 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:30:55 compute-1 nova_compute[183751]: 2026-01-27 22:30:55.529 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:30:55 compute-1 nova_compute[183751]: 2026-01-27 22:30:55.530 183755 DEBUG nova.virt.libvirt.driver [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:30:55 compute-1 neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1[218762]: [NOTICE]   (218766) : New worker (218768) forked
Jan 27 22:30:55 compute-1 neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1[218762]: [NOTICE]   (218766) : Loading success.
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.043 183755 INFO nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Took 10.03 seconds to spawn the instance on the hypervisor.
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.044 183755 DEBUG nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.585 183755 INFO nova.compute.manager [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Took 15.30 seconds to build instance.
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.752 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.820 183755 DEBUG nova.compute.manager [req-f04c42dd-7911-46a5-875d-411d9d080b82 req-7aef30e7-4df6-4412-875c-80a288eee7ac 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Received event network-vif-plugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.821 183755 DEBUG oslo_concurrency.lockutils [req-f04c42dd-7911-46a5-875d-411d9d080b82 req-7aef30e7-4df6-4412-875c-80a288eee7ac 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.821 183755 DEBUG oslo_concurrency.lockutils [req-f04c42dd-7911-46a5-875d-411d9d080b82 req-7aef30e7-4df6-4412-875c-80a288eee7ac 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.821 183755 DEBUG oslo_concurrency.lockutils [req-f04c42dd-7911-46a5-875d-411d9d080b82 req-7aef30e7-4df6-4412-875c-80a288eee7ac 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.822 183755 DEBUG nova.compute.manager [req-f04c42dd-7911-46a5-875d-411d9d080b82 req-7aef30e7-4df6-4412-875c-80a288eee7ac 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] No waiting events found dispatching network-vif-plugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.822 183755 WARNING nova.compute.manager [req-f04c42dd-7911-46a5-875d-411d9d080b82 req-7aef30e7-4df6-4412-875c-80a288eee7ac 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Received unexpected event network-vif-plugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 for instance with vm_state active and task_state None.
Jan 27 22:30:56 compute-1 nova_compute[183751]: 2026-01-27 22:30:56.826 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:30:57 compute-1 nova_compute[183751]: 2026-01-27 22:30:57.093 183755 DEBUG oslo_concurrency.lockutils [None req-c6450206-3be3-493b-baad-cbf254799ff8 c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.820s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:30:57 compute-1 podman[218777]: 2026-01-27 22:30:57.786991628 +0000 UTC m=+0.086276142 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.151 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.501 183755 DEBUG oslo_concurrency.lockutils [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquiring lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.502 183755 DEBUG oslo_concurrency.lockutils [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.502 183755 DEBUG oslo_concurrency.lockutils [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquiring lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.502 183755 DEBUG oslo_concurrency.lockutils [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.503 183755 DEBUG oslo_concurrency.lockutils [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.516 183755 INFO nova.compute.manager [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Terminating instance
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:31:00 compute-1 nova_compute[183751]: 2026-01-27 22:31:00.666 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.034 183755 DEBUG nova.compute.manager [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 27 22:31:01 compute-1 kernel: tapf0e0382c-6a (unregistering): left promiscuous mode
Jan 27 22:31:01 compute-1 NetworkManager[56069]: <info>  [1769553061.0561] device (tapf0e0382c-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 27 22:31:01 compute-1 ovn_controller[95969]: 2026-01-27T22:31:01Z|00097|binding|INFO|Releasing lport f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 from this chassis (sb_readonly=0)
Jan 27 22:31:01 compute-1 ovn_controller[95969]: 2026-01-27T22:31:01Z|00098|binding|INFO|Setting lport f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 down in Southbound
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.064 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 ovn_controller[95969]: 2026-01-27T22:31:01Z|00099|binding|INFO|Removing iface tapf0e0382c-6a ovn-installed in OVS
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.069 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.073 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:14:3a 10.100.0.13'], port_security=['fa:16:3e:5f:14:3a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8ef7d52b-ca91-4353-ba68-c3e66978c93d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f5e7d0ea0cbe4c4f9e5e1490f6110cec', 'neutron:revision_number': '5', 'neutron:security_group_ids': '358bb44c-2ba1-4561-a48e-1e8eb336c84e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a01a449d-053f-4da0-a741-7ac912b0905f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.075 105247 INFO neutron.agent.ovn.metadata.agent [-] Port f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 in datapath bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1 unbound from our chassis
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.076 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.078 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4a7f7a-74e8-444c-a0a3-97fe06e3c3ac]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.078 105247 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1 namespace which is not needed anymore
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.093 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 27 22:31:01 compute-1 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Consumed 6.666s CPU time.
Jan 27 22:31:01 compute-1 systemd-machined[155034]: Machine qemu-7-instance-0000000e terminated.
Jan 27 22:31:01 compute-1 neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1[218762]: [NOTICE]   (218766) : haproxy version is 3.0.5-8e879a5
Jan 27 22:31:01 compute-1 neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1[218762]: [NOTICE]   (218766) : path to executable is /usr/sbin/haproxy
Jan 27 22:31:01 compute-1 neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1[218762]: [WARNING]  (218766) : Exiting Master process...
Jan 27 22:31:01 compute-1 podman[218827]: 2026-01-27 22:31:01.260314641 +0000 UTC m=+0.050927189 container kill 63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 27 22:31:01 compute-1 neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1[218762]: [ALERT]    (218766) : Current worker (218768) exited with code 143 (Terminated)
Jan 27 22:31:01 compute-1 neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1[218762]: [WARNING]  (218766) : All workers exited. Exiting... (0)
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.261 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 systemd[1]: libpod-63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8.scope: Deactivated successfully.
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.275 183755 DEBUG nova.compute.manager [req-b3a4146a-7f1d-4cbe-9111-177927b3159c req-799a6954-f5b4-4032-9151-e7b157d40b6c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Received event network-vif-unplugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.275 183755 DEBUG oslo_concurrency.lockutils [req-b3a4146a-7f1d-4cbe-9111-177927b3159c req-799a6954-f5b4-4032-9151-e7b157d40b6c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.275 183755 DEBUG oslo_concurrency.lockutils [req-b3a4146a-7f1d-4cbe-9111-177927b3159c req-799a6954-f5b4-4032-9151-e7b157d40b6c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.276 183755 DEBUG oslo_concurrency.lockutils [req-b3a4146a-7f1d-4cbe-9111-177927b3159c req-799a6954-f5b4-4032-9151-e7b157d40b6c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.276 183755 DEBUG nova.compute.manager [req-b3a4146a-7f1d-4cbe-9111-177927b3159c req-799a6954-f5b4-4032-9151-e7b157d40b6c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] No waiting events found dispatching network-vif-unplugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.276 183755 DEBUG nova.compute.manager [req-b3a4146a-7f1d-4cbe-9111-177927b3159c req-799a6954-f5b4-4032-9151-e7b157d40b6c 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Received event network-vif-unplugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.277 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 podman[218847]: 2026-01-27 22:31:01.305402394 +0000 UTC m=+0.030340360 container died 63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.314 183755 INFO nova.virt.libvirt.driver [-] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Instance destroyed successfully.
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.316 183755 DEBUG nova.objects.instance [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lazy-loading 'resources' on Instance uuid 8ef7d52b-ca91-4353-ba68-c3e66978c93d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:31:01 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8-userdata-shm.mount: Deactivated successfully.
Jan 27 22:31:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-d9bc4380fdc41bc88113f200a18b95679e2b98cc1a18e164f087174edfc31205-merged.mount: Deactivated successfully.
Jan 27 22:31:01 compute-1 podman[218847]: 2026-01-27 22:31:01.35581971 +0000 UTC m=+0.080757606 container cleanup 63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 27 22:31:01 compute-1 systemd[1]: libpod-conmon-63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8.scope: Deactivated successfully.
Jan 27 22:31:01 compute-1 podman[218863]: 2026-01-27 22:31:01.377660329 +0000 UTC m=+0.074706506 container remove 63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.385 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[497b4b48-021f-44cd-8bc1-3c31ae24a063]: (4, ("Tue Jan 27 10:31:01 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1 (63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8)\n63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8\nTue Jan 27 10:31:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1 (63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8)\n63d42d098638840680f542b716c018906b8ec116e83b0c9643d4e09a96f48aa8\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.387 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5326ab3d-329c-4b13-9c92-5a08d245c8c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.387 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.388 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[31a30ba1-4466-449d-a479-75eeccab715d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.389 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfdc4bdd-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.392 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 kernel: tapbfdc4bdd-00: left promiscuous mode
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.408 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.412 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[2d144f8c-c292-43c5-b837-1008dc425ba6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.432 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[46a10880-8443-4727-bfac-188c651fe9a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.433 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[0246af4c-3ba3-4fb2-a919-8306a87b34d5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.453 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[ed137b60-2b82-40ab-a07a-261ca27bea7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 919680, 'reachable_time': 39844, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218892, 'error': None, 'target': 'ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:01 compute-1 systemd[1]: run-netns-ovnmeta\x2dbfdc4bdd\x2d0f80\x2d455a\x2dbcbb\x2d6d76b2cab9c1.mount: Deactivated successfully.
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.455 105687 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 27 22:31:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:01.456 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[a7dc487c-42f3-4717-83cf-5a55021f89b0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.713 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.756 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.800 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.801 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.828 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.834 183755 DEBUG nova.virt.libvirt.vif [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-27T22:30:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2039109839',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2039109839',id=14,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-27T22:30:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f5e7d0ea0cbe4c4f9e5e1490f6110cec',ramdisk_id='',reservation_id='r-wrzr4otu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-567132824',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-567132824-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-27T22:30:56Z,user_data=None,user_id='c1895fcc128e498c8220a133604740e8',uuid=8ef7d52b-ca91-4353-ba68-c3e66978c93d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "address": "fa:16:3e:5f:14:3a", "network": {"id": "bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-296801654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf213dc42fc04bdbb0c321ce189cfdb8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e0382c-6a", "ovs_interfaceid": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.835 183755 DEBUG nova.network.os_vif_util [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Converting VIF {"id": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "address": "fa:16:3e:5f:14:3a", "network": {"id": "bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-296801654-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf213dc42fc04bdbb0c321ce189cfdb8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0e0382c-6a", "ovs_interfaceid": "f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.837 183755 DEBUG nova.network.os_vif_util [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:3a,bridge_name='br-int',has_traffic_filtering=True,id=f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0,network=Network(bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e0382c-6a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.837 183755 DEBUG os_vif [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:3a,bridge_name='br-int',has_traffic_filtering=True,id=f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0,network=Network(bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e0382c-6a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.840 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.841 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0e0382c-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.843 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.844 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.846 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.846 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e58ca302-75d9-44a3-88ce-c17c2df20973) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.847 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.849 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.852 183755 INFO os_vif [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:3a,bridge_name='br-int',has_traffic_filtering=True,id=f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0,network=Network(bfdc4bdd-0f80-455a-bcbb-6d76b2cab9c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0e0382c-6a')
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.853 183755 INFO nova.virt.libvirt.driver [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Deleting instance files /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d_del
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.854 183755 INFO nova.virt.libvirt.driver [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Deletion of /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d_del complete
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.886 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk --force-share --output=json" returned: 1 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.886 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] '/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk --force-share --output=json' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Jan 27 22:31:01 compute-1 nova_compute[183751]: 2026-01-27 22:31:01.887 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000000e, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/8ef7d52b-ca91-4353-ba68-c3e66978c93d/disk
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.091 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.093 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.125 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.126 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5685MB free_disk=73.14170837402344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.127 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.127 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.370 183755 INFO nova.compute.manager [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Took 1.34 seconds to destroy the instance on the hypervisor.
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.371 183755 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.371 183755 DEBUG nova.compute.manager [-] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.371 183755 DEBUG nova.network.neutron [-] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.372 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:31:02 compute-1 nova_compute[183751]: 2026-01-27 22:31:02.566 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.350 183755 DEBUG nova.network.neutron [-] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.365 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Instance 8ef7d52b-ca91-4353-ba68-c3e66978c93d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.365 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.365 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:31:02 up  2:33,  0 user,  load average: 0.49, 0.32, 0.18\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_f5e7d0ea0cbe4c4f9e5e1490f6110cec': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.386 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.398 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.398 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.405 183755 DEBUG nova.compute.manager [req-5584c35d-710b-4a2a-bcf3-623d52e004cc req-d71daafb-beb1-40eb-9b2a-7b05c9c0c127 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Received event network-vif-unplugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.405 183755 DEBUG oslo_concurrency.lockutils [req-5584c35d-710b-4a2a-bcf3-623d52e004cc req-d71daafb-beb1-40eb-9b2a-7b05c9c0c127 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.405 183755 DEBUG oslo_concurrency.lockutils [req-5584c35d-710b-4a2a-bcf3-623d52e004cc req-d71daafb-beb1-40eb-9b2a-7b05c9c0c127 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.406 183755 DEBUG oslo_concurrency.lockutils [req-5584c35d-710b-4a2a-bcf3-623d52e004cc req-d71daafb-beb1-40eb-9b2a-7b05c9c0c127 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.406 183755 DEBUG nova.compute.manager [req-5584c35d-710b-4a2a-bcf3-623d52e004cc req-d71daafb-beb1-40eb-9b2a-7b05c9c0c127 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] No waiting events found dispatching network-vif-unplugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.406 183755 DEBUG nova.compute.manager [req-5584c35d-710b-4a2a-bcf3-623d52e004cc req-d71daafb-beb1-40eb-9b2a-7b05c9c0c127 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Received event network-vif-unplugged-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.407 183755 DEBUG nova.compute.manager [req-5584c35d-710b-4a2a-bcf3-623d52e004cc req-d71daafb-beb1-40eb-9b2a-7b05c9c0c127 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Received event network-vif-deleted-f0e0382c-6a3e-4c45-962d-b0c8b8fc40b0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.409 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.429 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.472 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.856 183755 INFO nova.compute.manager [-] [instance: 8ef7d52b-ca91-4353-ba68-c3e66978c93d] Took 1.48 seconds to deallocate network for instance.
Jan 27 22:31:03 compute-1 nova_compute[183751]: 2026-01-27 22:31:03.979 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:31:04 compute-1 nova_compute[183751]: 2026-01-27 22:31:04.388 183755 DEBUG oslo_concurrency.lockutils [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:31:04 compute-1 nova_compute[183751]: 2026-01-27 22:31:04.495 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:31:04 compute-1 nova_compute[183751]: 2026-01-27 22:31:04.496 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.369s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:31:04 compute-1 nova_compute[183751]: 2026-01-27 22:31:04.496 183755 DEBUG oslo_concurrency.lockutils [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:31:04 compute-1 nova_compute[183751]: 2026-01-27 22:31:04.568 183755 DEBUG nova.compute.provider_tree [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:31:05 compute-1 nova_compute[183751]: 2026-01-27 22:31:05.077 183755 DEBUG nova.scheduler.client.report [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:31:05 compute-1 nova_compute[183751]: 2026-01-27 22:31:05.493 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:31:05 compute-1 nova_compute[183751]: 2026-01-27 22:31:05.493 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:31:05 compute-1 nova_compute[183751]: 2026-01-27 22:31:05.494 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:31:05 compute-1 nova_compute[183751]: 2026-01-27 22:31:05.494 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:31:05 compute-1 nova_compute[183751]: 2026-01-27 22:31:05.495 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:31:05 compute-1 nova_compute[183751]: 2026-01-27 22:31:05.495 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:31:05 compute-1 nova_compute[183751]: 2026-01-27 22:31:05.495 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:31:05 compute-1 nova_compute[183751]: 2026-01-27 22:31:05.589 183755 DEBUG oslo_concurrency.lockutils [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:31:05 compute-1 nova_compute[183751]: 2026-01-27 22:31:05.643 183755 INFO nova.scheduler.client.report [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Deleted allocations for instance 8ef7d52b-ca91-4353-ba68-c3e66978c93d
Jan 27 22:31:05 compute-1 podman[193064]: time="2026-01-27T22:31:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:31:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:31:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:31:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:31:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 27 22:31:06 compute-1 nova_compute[183751]: 2026-01-27 22:31:06.681 183755 DEBUG oslo_concurrency.lockutils [None req-2d195e4a-d6d3-4154-b843-c2ef4c59924a c1895fcc128e498c8220a133604740e8 f5e7d0ea0cbe4c4f9e5e1490f6110cec - - default default] Lock "8ef7d52b-ca91-4353-ba68-c3e66978c93d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.180s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:31:06 compute-1 nova_compute[183751]: 2026-01-27 22:31:06.831 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:06 compute-1 nova_compute[183751]: 2026-01-27 22:31:06.848 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:07 compute-1 nova_compute[183751]: 2026-01-27 22:31:07.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:31:11 compute-1 nova_compute[183751]: 2026-01-27 22:31:11.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:31:11 compute-1 nova_compute[183751]: 2026-01-27 22:31:11.149 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:31:11 compute-1 nova_compute[183751]: 2026-01-27 22:31:11.150 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:31:11 compute-1 nova_compute[183751]: 2026-01-27 22:31:11.150 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:31:11 compute-1 nova_compute[183751]: 2026-01-27 22:31:11.151 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:31:11 compute-1 nova_compute[183751]: 2026-01-27 22:31:11.151 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:31:11 compute-1 nova_compute[183751]: 2026-01-27 22:31:11.152 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:31:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:11.280 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:31:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:11.281 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:31:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:11.281 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:31:11 compute-1 nova_compute[183751]: 2026-01-27 22:31:11.833 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:11 compute-1 nova_compute[183751]: 2026-01-27 22:31:11.850 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:12 compute-1 nova_compute[183751]: 2026-01-27 22:31:12.171 183755 DEBUG nova.virt.libvirt.imagecache [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 27 22:31:12 compute-1 nova_compute[183751]: 2026-01-27 22:31:12.172 183755 WARNING nova.virt.libvirt.imagecache [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd
Jan 27 22:31:12 compute-1 nova_compute[183751]: 2026-01-27 22:31:12.172 183755 INFO nova.virt.libvirt.imagecache [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Removable base files: /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd
Jan 27 22:31:12 compute-1 nova_compute[183751]: 2026-01-27 22:31:12.173 183755 INFO nova.virt.libvirt.imagecache [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd
Jan 27 22:31:12 compute-1 nova_compute[183751]: 2026-01-27 22:31:12.174 183755 DEBUG nova.virt.libvirt.imagecache [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 27 22:31:12 compute-1 nova_compute[183751]: 2026-01-27 22:31:12.174 183755 DEBUG nova.virt.libvirt.imagecache [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 27 22:31:12 compute-1 nova_compute[183751]: 2026-01-27 22:31:12.174 183755 DEBUG nova.virt.libvirt.imagecache [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 27 22:31:13 compute-1 podman[218900]: 2026-01-27 22:31:13.82392025 +0000 UTC m=+0.124417684 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Jan 27 22:31:16 compute-1 nova_compute[183751]: 2026-01-27 22:31:16.835 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:16 compute-1 nova_compute[183751]: 2026-01-27 22:31:16.851 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:18 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:18.514 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:31:18 compute-1 nova_compute[183751]: 2026-01-27 22:31:18.514 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:18 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:18.515 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:31:19 compute-1 openstack_network_exporter[195945]: ERROR   22:31:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:31:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:31:19 compute-1 openstack_network_exporter[195945]: ERROR   22:31:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:31:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:31:19 compute-1 podman[218928]: 2026-01-27 22:31:19.806143474 +0000 UTC m=+0.095836728 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Jan 27 22:31:19 compute-1 podman[218927]: 2026-01-27 22:31:19.847316531 +0000 UTC m=+0.136979574 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git)
Jan 27 22:31:20 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:20.517 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:31:21 compute-1 nova_compute[183751]: 2026-01-27 22:31:21.838 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:21 compute-1 nova_compute[183751]: 2026-01-27 22:31:21.853 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:22 compute-1 nova_compute[183751]: 2026-01-27 22:31:22.764 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:26 compute-1 nova_compute[183751]: 2026-01-27 22:31:26.840 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:26 compute-1 nova_compute[183751]: 2026-01-27 22:31:26.853 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:28 compute-1 podman[218962]: 2026-01-27 22:31:28.774859738 +0000 UTC m=+0.073262890 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:31:31 compute-1 nova_compute[183751]: 2026-01-27 22:31:31.843 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:31 compute-1 nova_compute[183751]: 2026-01-27 22:31:31.855 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:35 compute-1 podman[193064]: time="2026-01-27T22:31:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:31:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:31:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:31:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:31:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 27 22:31:36 compute-1 nova_compute[183751]: 2026-01-27 22:31:36.846 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:36 compute-1 nova_compute[183751]: 2026-01-27 22:31:36.855 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:38 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:38.945 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:4c:4a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f9204588-8074-4f71-a651-f5cf3fa37239', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9204588-8074-4f71-a651-f5cf3fa37239', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab442916dd3949bfb3e3aef0119bc7c0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=35405834-d21a-4cb7-97d6-657f196f8162, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d04f921d-0f7b-4cd0-ba3a-2fe6e14bd90f) old=Port_Binding(mac=['fa:16:3e:e0:4c:4a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f9204588-8074-4f71-a651-f5cf3fa37239', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9204588-8074-4f71-a651-f5cf3fa37239', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab442916dd3949bfb3e3aef0119bc7c0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:31:38 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:38.946 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d04f921d-0f7b-4cd0-ba3a-2fe6e14bd90f in datapath f9204588-8074-4f71-a651-f5cf3fa37239 updated
Jan 27 22:31:38 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:38.947 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9204588-8074-4f71-a651-f5cf3fa37239, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:31:38 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:38.948 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[9d070d30-8050-4c05-afdb-9c026630c74e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:41 compute-1 nova_compute[183751]: 2026-01-27 22:31:41.847 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:41 compute-1 nova_compute[183751]: 2026-01-27 22:31:41.856 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:44 compute-1 podman[218989]: 2026-01-27 22:31:44.848445566 +0000 UTC m=+0.152323514 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:31:46 compute-1 nova_compute[183751]: 2026-01-27 22:31:46.849 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:46 compute-1 nova_compute[183751]: 2026-01-27 22:31:46.858 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:49 compute-1 openstack_network_exporter[195945]: ERROR   22:31:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:31:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:31:49 compute-1 openstack_network_exporter[195945]: ERROR   22:31:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:31:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:31:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:49.676 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:06:35 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a2f3319b-cd20-495a-848c-7f74eb26327d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2f3319b-cd20-495a-848c-7f74eb26327d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68bc9cd32fd447c49d449bee2feb39e4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=011bf9ef-548a-4c97-b110-3935231e029a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=075d8cf1-f7d4-40ce-83ec-7168d0e4f80b) old=Port_Binding(mac=['fa:16:3e:e0:06:35'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a2f3319b-cd20-495a-848c-7f74eb26327d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2f3319b-cd20-495a-848c-7f74eb26327d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68bc9cd32fd447c49d449bee2feb39e4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:31:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:49.677 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 075d8cf1-f7d4-40ce-83ec-7168d0e4f80b in datapath a2f3319b-cd20-495a-848c-7f74eb26327d updated
Jan 27 22:31:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:49.678 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2f3319b-cd20-495a-848c-7f74eb26327d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:31:49 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:31:49.679 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[72f52370-6618-4904-92eb-dea806fbad5f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:31:50 compute-1 podman[219016]: 2026-01-27 22:31:50.772566685 +0000 UTC m=+0.079730370 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 22:31:50 compute-1 podman[219017]: 2026-01-27 22:31:50.809431966 +0000 UTC m=+0.104980964 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 22:31:51 compute-1 nova_compute[183751]: 2026-01-27 22:31:51.851 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:51 compute-1 nova_compute[183751]: 2026-01-27 22:31:51.860 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:56 compute-1 nova_compute[183751]: 2026-01-27 22:31:56.852 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:56 compute-1 nova_compute[183751]: 2026-01-27 22:31:56.860 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:31:59 compute-1 nova_compute[183751]: 2026-01-27 22:31:59.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:31:59 compute-1 nova_compute[183751]: 2026-01-27 22:31:59.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 22:31:59 compute-1 nova_compute[183751]: 2026-01-27 22:31:59.656 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 22:31:59 compute-1 podman[219052]: 2026-01-27 22:31:59.774834707 +0000 UTC m=+0.085785850 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:31:59 compute-1 ovn_controller[95969]: 2026-01-27T22:31:59Z|00100|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 27 22:32:01 compute-1 nova_compute[183751]: 2026-01-27 22:32:01.656 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:01 compute-1 nova_compute[183751]: 2026-01-27 22:32:01.656 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:01 compute-1 nova_compute[183751]: 2026-01-27 22:32:01.855 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:01 compute-1 nova_compute[183751]: 2026-01-27 22:32:01.861 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:02 compute-1 nova_compute[183751]: 2026-01-27 22:32:02.170 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:32:02 compute-1 nova_compute[183751]: 2026-01-27 22:32:02.170 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:32:02 compute-1 nova_compute[183751]: 2026-01-27 22:32:02.171 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:32:02 compute-1 nova_compute[183751]: 2026-01-27 22:32:02.171 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:32:02 compute-1 nova_compute[183751]: 2026-01-27 22:32:02.394 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:32:02 compute-1 nova_compute[183751]: 2026-01-27 22:32:02.396 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:32:02 compute-1 nova_compute[183751]: 2026-01-27 22:32:02.413 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:32:02 compute-1 nova_compute[183751]: 2026-01-27 22:32:02.414 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5856MB free_disk=73.14260482788086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:32:02 compute-1 nova_compute[183751]: 2026-01-27 22:32:02.414 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:32:02 compute-1 nova_compute[183751]: 2026-01-27 22:32:02.415 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:32:03 compute-1 nova_compute[183751]: 2026-01-27 22:32:03.470 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:32:03 compute-1 nova_compute[183751]: 2026-01-27 22:32:03.471 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:32:02 up  2:34,  0 user,  load average: 0.22, 0.28, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:32:03 compute-1 nova_compute[183751]: 2026-01-27 22:32:03.542 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:32:04 compute-1 nova_compute[183751]: 2026-01-27 22:32:04.052 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:32:04 compute-1 nova_compute[183751]: 2026-01-27 22:32:04.564 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:32:04 compute-1 nova_compute[183751]: 2026-01-27 22:32:04.565 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.150s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:32:05 compute-1 podman[193064]: time="2026-01-27T22:32:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:32:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:32:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:32:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:32:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2185 "" "Go-http-client/1.1"
Jan 27 22:32:06 compute-1 nova_compute[183751]: 2026-01-27 22:32:06.057 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:06 compute-1 nova_compute[183751]: 2026-01-27 22:32:06.058 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:06 compute-1 nova_compute[183751]: 2026-01-27 22:32:06.058 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:06 compute-1 nova_compute[183751]: 2026-01-27 22:32:06.059 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:06 compute-1 nova_compute[183751]: 2026-01-27 22:32:06.059 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:32:06 compute-1 nova_compute[183751]: 2026-01-27 22:32:06.858 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:06 compute-1 nova_compute[183751]: 2026-01-27 22:32:06.863 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:07 compute-1 nova_compute[183751]: 2026-01-27 22:32:07.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:07 compute-1 nova_compute[183751]: 2026-01-27 22:32:07.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:32:11.284 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:32:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:32:11.284 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:32:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:32:11.284 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:32:11 compute-1 nova_compute[183751]: 2026-01-27 22:32:11.861 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:11 compute-1 nova_compute[183751]: 2026-01-27 22:32:11.863 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:15 compute-1 podman[219079]: 2026-01-27 22:32:15.826480593 +0000 UTC m=+0.128948715 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2)
Jan 27 22:32:16 compute-1 nova_compute[183751]: 2026-01-27 22:32:16.865 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:17 compute-1 nova_compute[183751]: 2026-01-27 22:32:17.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:17 compute-1 nova_compute[183751]: 2026-01-27 22:32:17.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 22:32:19 compute-1 openstack_network_exporter[195945]: ERROR   22:32:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:32:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:32:19 compute-1 openstack_network_exporter[195945]: ERROR   22:32:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:32:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:32:21 compute-1 podman[219106]: 2026-01-27 22:32:21.771314075 +0000 UTC m=+0.072205524 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 27 22:32:21 compute-1 podman[219105]: 2026-01-27 22:32:21.786173322 +0000 UTC m=+0.097852857 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64)
Jan 27 22:32:21 compute-1 nova_compute[183751]: 2026-01-27 22:32:21.867 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:21 compute-1 nova_compute[183751]: 2026-01-27 22:32:21.871 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:26 compute-1 nova_compute[183751]: 2026-01-27 22:32:26.868 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:26 compute-1 nova_compute[183751]: 2026-01-27 22:32:26.871 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:30 compute-1 podman[219146]: 2026-01-27 22:32:30.789093228 +0000 UTC m=+0.092078115 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 27 22:32:31 compute-1 nova_compute[183751]: 2026-01-27 22:32:31.650 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:31 compute-1 nova_compute[183751]: 2026-01-27 22:32:31.872 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:32:31 compute-1 nova_compute[183751]: 2026-01-27 22:32:31.874 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:32:31 compute-1 nova_compute[183751]: 2026-01-27 22:32:31.875 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:32:31 compute-1 nova_compute[183751]: 2026-01-27 22:32:31.875 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:32:31 compute-1 nova_compute[183751]: 2026-01-27 22:32:31.923 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:31 compute-1 nova_compute[183751]: 2026-01-27 22:32:31.924 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:32:34 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:32:34.780 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:32:34 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:32:34.782 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:32:34 compute-1 nova_compute[183751]: 2026-01-27 22:32:34.782 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:35 compute-1 podman[193064]: time="2026-01-27T22:32:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:32:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:32:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:32:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:32:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 27 22:32:36 compute-1 nova_compute[183751]: 2026-01-27 22:32:36.925 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:41 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:32:41.785 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:32:41 compute-1 nova_compute[183751]: 2026-01-27 22:32:41.927 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:32:43 compute-1 nova_compute[183751]: 2026-01-27 22:32:43.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:32:46 compute-1 podman[219171]: 2026-01-27 22:32:46.850008863 +0000 UTC m=+0.154520148 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 27 22:32:46 compute-1 nova_compute[183751]: 2026-01-27 22:32:46.930 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:32:49 compute-1 openstack_network_exporter[195945]: ERROR   22:32:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:32:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:32:49 compute-1 openstack_network_exporter[195945]: ERROR   22:32:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:32:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:32:51 compute-1 nova_compute[183751]: 2026-01-27 22:32:51.932 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:51 compute-1 nova_compute[183751]: 2026-01-27 22:32:51.934 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:32:52 compute-1 podman[219198]: 2026-01-27 22:32:52.769799776 +0000 UTC m=+0.071083567 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 27 22:32:52 compute-1 podman[219197]: 2026-01-27 22:32:52.806649166 +0000 UTC m=+0.113009003 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Jan 27 22:32:56 compute-1 nova_compute[183751]: 2026-01-27 22:32:56.936 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:01 compute-1 nova_compute[183751]: 2026-01-27 22:33:01.655 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:33:01 compute-1 nova_compute[183751]: 2026-01-27 22:33:01.656 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:33:01 compute-1 podman[219238]: 2026-01-27 22:33:01.791316355 +0000 UTC m=+0.094006723 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:33:01 compute-1 nova_compute[183751]: 2026-01-27 22:33:01.938 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:02 compute-1 nova_compute[183751]: 2026-01-27 22:33:02.170 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:33:02 compute-1 nova_compute[183751]: 2026-01-27 22:33:02.171 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:33:02 compute-1 nova_compute[183751]: 2026-01-27 22:33:02.172 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:33:02 compute-1 nova_compute[183751]: 2026-01-27 22:33:02.172 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:33:02 compute-1 nova_compute[183751]: 2026-01-27 22:33:02.394 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:33:02 compute-1 nova_compute[183751]: 2026-01-27 22:33:02.396 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:33:02 compute-1 nova_compute[183751]: 2026-01-27 22:33:02.438 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:33:02 compute-1 nova_compute[183751]: 2026-01-27 22:33:02.439 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5853MB free_disk=73.14260482788086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:33:02 compute-1 nova_compute[183751]: 2026-01-27 22:33:02.439 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:33:02 compute-1 nova_compute[183751]: 2026-01-27 22:33:02.439 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:33:03 compute-1 nova_compute[183751]: 2026-01-27 22:33:03.488 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:33:03 compute-1 nova_compute[183751]: 2026-01-27 22:33:03.488 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:33:02 up  2:35,  0 user,  load average: 0.08, 0.22, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:33:03 compute-1 nova_compute[183751]: 2026-01-27 22:33:03.520 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:33:04 compute-1 nova_compute[183751]: 2026-01-27 22:33:04.030 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:33:04 compute-1 nova_compute[183751]: 2026-01-27 22:33:04.542 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:33:04 compute-1 nova_compute[183751]: 2026-01-27 22:33:04.542 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:33:05 compute-1 podman[193064]: time="2026-01-27T22:33:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:33:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:33:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:33:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:33:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 27 22:33:06 compute-1 nova_compute[183751]: 2026-01-27 22:33:06.940 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:07 compute-1 nova_compute[183751]: 2026-01-27 22:33:07.031 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:33:07 compute-1 nova_compute[183751]: 2026-01-27 22:33:07.032 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:33:07 compute-1 nova_compute[183751]: 2026-01-27 22:33:07.033 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:33:07 compute-1 nova_compute[183751]: 2026-01-27 22:33:07.034 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:33:07 compute-1 nova_compute[183751]: 2026-01-27 22:33:07.034 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:33:07 compute-1 nova_compute[183751]: 2026-01-27 22:33:07.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:33:07 compute-1 nova_compute[183751]: 2026-01-27 22:33:07.786 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:33:09 compute-1 nova_compute[183751]: 2026-01-27 22:33:09.659 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:33:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:33:11.285 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:33:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:33:11.286 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:33:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:33:11.286 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:33:11 compute-1 nova_compute[183751]: 2026-01-27 22:33:11.941 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:11 compute-1 nova_compute[183751]: 2026-01-27 22:33:11.944 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:16 compute-1 nova_compute[183751]: 2026-01-27 22:33:16.945 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:17 compute-1 podman[219264]: 2026-01-27 22:33:17.877146255 +0000 UTC m=+0.183654647 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 27 22:33:19 compute-1 openstack_network_exporter[195945]: ERROR   22:33:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:33:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:33:19 compute-1 openstack_network_exporter[195945]: ERROR   22:33:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:33:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:33:21 compute-1 nova_compute[183751]: 2026-01-27 22:33:21.946 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:21 compute-1 nova_compute[183751]: 2026-01-27 22:33:21.948 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:21 compute-1 nova_compute[183751]: 2026-01-27 22:33:21.948 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:33:21 compute-1 nova_compute[183751]: 2026-01-27 22:33:21.948 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:33:21 compute-1 nova_compute[183751]: 2026-01-27 22:33:21.983 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:21 compute-1 nova_compute[183751]: 2026-01-27 22:33:21.984 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:33:23 compute-1 podman[219290]: 2026-01-27 22:33:23.802962368 +0000 UTC m=+0.102452822 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., config_id=openstack_network_exporter)
Jan 27 22:33:23 compute-1 podman[219291]: 2026-01-27 22:33:23.81276139 +0000 UTC m=+0.107659161 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Jan 27 22:33:26 compute-1 nova_compute[183751]: 2026-01-27 22:33:26.986 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:26 compute-1 nova_compute[183751]: 2026-01-27 22:33:26.987 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:26 compute-1 nova_compute[183751]: 2026-01-27 22:33:26.987 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:33:26 compute-1 nova_compute[183751]: 2026-01-27 22:33:26.988 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:33:27 compute-1 nova_compute[183751]: 2026-01-27 22:33:27.027 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:27 compute-1 nova_compute[183751]: 2026-01-27 22:33:27.027 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:33:30 compute-1 sshd-session[219332]: Invalid user solana from 80.94.92.186 port 35156
Jan 27 22:33:30 compute-1 sshd-session[219332]: Connection closed by invalid user solana 80.94.92.186 port 35156 [preauth]
Jan 27 22:33:32 compute-1 nova_compute[183751]: 2026-01-27 22:33:32.029 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:32 compute-1 nova_compute[183751]: 2026-01-27 22:33:32.030 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:32 compute-1 nova_compute[183751]: 2026-01-27 22:33:32.031 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:33:32 compute-1 nova_compute[183751]: 2026-01-27 22:33:32.031 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:33:32 compute-1 nova_compute[183751]: 2026-01-27 22:33:32.126 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:32 compute-1 nova_compute[183751]: 2026-01-27 22:33:32.128 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:33:32 compute-1 podman[219334]: 2026-01-27 22:33:32.773486515 +0000 UTC m=+0.080254573 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:33:35 compute-1 podman[193064]: time="2026-01-27T22:33:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:33:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:33:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:33:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:33:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 27 22:33:37 compute-1 nova_compute[183751]: 2026-01-27 22:33:37.129 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:37 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:33:37.216 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:33:37 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:33:37.217 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:33:37 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:33:37.218 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:33:37 compute-1 nova_compute[183751]: 2026-01-27 22:33:37.220 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:42 compute-1 nova_compute[183751]: 2026-01-27 22:33:42.131 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:47 compute-1 nova_compute[183751]: 2026-01-27 22:33:47.138 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:47 compute-1 nova_compute[183751]: 2026-01-27 22:33:47.140 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:48 compute-1 podman[219359]: 2026-01-27 22:33:48.853172545 +0000 UTC m=+0.151207485 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 22:33:49 compute-1 openstack_network_exporter[195945]: ERROR   22:33:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:33:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:33:49 compute-1 openstack_network_exporter[195945]: ERROR   22:33:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:33:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:33:52 compute-1 nova_compute[183751]: 2026-01-27 22:33:52.139 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:52 compute-1 nova_compute[183751]: 2026-01-27 22:33:52.141 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:33:52 compute-1 nova_compute[183751]: 2026-01-27 22:33:52.142 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:33:52 compute-1 nova_compute[183751]: 2026-01-27 22:33:52.142 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:33:52 compute-1 nova_compute[183751]: 2026-01-27 22:33:52.143 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:33:52 compute-1 nova_compute[183751]: 2026-01-27 22:33:52.144 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:33:54 compute-1 podman[219385]: 2026-01-27 22:33:54.798122899 +0000 UTC m=+0.097262064 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 27 22:33:54 compute-1 podman[219386]: 2026-01-27 22:33:54.80099695 +0000 UTC m=+0.094650609 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2)
Jan 27 22:33:57 compute-1 nova_compute[183751]: 2026-01-27 22:33:57.144 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:02 compute-1 nova_compute[183751]: 2026-01-27 22:34:02.147 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:02 compute-1 nova_compute[183751]: 2026-01-27 22:34:02.150 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:02 compute-1 nova_compute[183751]: 2026-01-27 22:34:02.150 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:34:02 compute-1 nova_compute[183751]: 2026-01-27 22:34:02.150 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:02 compute-1 nova_compute[183751]: 2026-01-27 22:34:02.186 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:02 compute-1 nova_compute[183751]: 2026-01-27 22:34:02.187 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.667 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:34:03 compute-1 podman[219422]: 2026-01-27 22:34:03.791708505 +0000 UTC m=+0.095873628 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.919 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.922 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.962 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.963 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5856MB free_disk=73.13686752319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.963 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:34:03 compute-1 nova_compute[183751]: 2026-01-27 22:34:03.964 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:34:05 compute-1 nova_compute[183751]: 2026-01-27 22:34:05.067 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:34:05 compute-1 nova_compute[183751]: 2026-01-27 22:34:05.068 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:34:03 up  2:36,  0 user,  load average: 0.08, 0.20, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:34:05 compute-1 nova_compute[183751]: 2026-01-27 22:34:05.089 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:34:05 compute-1 nova_compute[183751]: 2026-01-27 22:34:05.598 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:34:05 compute-1 podman[193064]: time="2026-01-27T22:34:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:34:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:34:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:34:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:34:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2175 "" "Go-http-client/1.1"
Jan 27 22:34:06 compute-1 nova_compute[183751]: 2026-01-27 22:34:06.122 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:34:06 compute-1 nova_compute[183751]: 2026-01-27 22:34:06.123 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.158s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:34:07 compute-1 nova_compute[183751]: 2026-01-27 22:34:07.187 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:07 compute-1 nova_compute[183751]: 2026-01-27 22:34:07.189 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:07 compute-1 nova_compute[183751]: 2026-01-27 22:34:07.190 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:34:07 compute-1 nova_compute[183751]: 2026-01-27 22:34:07.190 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:07 compute-1 nova_compute[183751]: 2026-01-27 22:34:07.241 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:07 compute-1 nova_compute[183751]: 2026-01-27 22:34:07.242 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:08 compute-1 nova_compute[183751]: 2026-01-27 22:34:08.119 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:34:08 compute-1 nova_compute[183751]: 2026-01-27 22:34:08.120 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:34:08 compute-1 nova_compute[183751]: 2026-01-27 22:34:08.120 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:34:08 compute-1 nova_compute[183751]: 2026-01-27 22:34:08.121 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:34:08 compute-1 nova_compute[183751]: 2026-01-27 22:34:08.121 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:34:09 compute-1 nova_compute[183751]: 2026-01-27 22:34:09.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:34:09 compute-1 nova_compute[183751]: 2026-01-27 22:34:09.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:34:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:11.288 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:34:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:11.288 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:34:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:11.288 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:34:12 compute-1 nova_compute[183751]: 2026-01-27 22:34:12.243 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:17 compute-1 nova_compute[183751]: 2026-01-27 22:34:17.245 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:17 compute-1 nova_compute[183751]: 2026-01-27 22:34:17.248 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:17 compute-1 nova_compute[183751]: 2026-01-27 22:34:17.248 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:34:17 compute-1 nova_compute[183751]: 2026-01-27 22:34:17.249 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:17 compute-1 nova_compute[183751]: 2026-01-27 22:34:17.283 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:17 compute-1 nova_compute[183751]: 2026-01-27 22:34:17.284 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:19 compute-1 openstack_network_exporter[195945]: ERROR   22:34:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:34:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:34:19 compute-1 openstack_network_exporter[195945]: ERROR   22:34:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:34:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:34:19 compute-1 podman[219449]: 2026-01-27 22:34:19.852923586 +0000 UTC m=+0.148363226 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 27 22:34:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:21.326 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:b3:0c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-96689921-9f22-479c-9093-ea866db28570', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-96689921-9f22-479c-9093-ea866db28570', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ad2fc2dd7fe483fb3baa3f7a8674052', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=354609ca-a8cb-44d6-826d-ca63cdf80446, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b69b64a0-4426-48ed-850e-798cf6069ee0) old=Port_Binding(mac=['fa:16:3e:0a:b3:0c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-96689921-9f22-479c-9093-ea866db28570', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-96689921-9f22-479c-9093-ea866db28570', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ad2fc2dd7fe483fb3baa3f7a8674052', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:34:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:21.327 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b69b64a0-4426-48ed-850e-798cf6069ee0 in datapath 96689921-9f22-479c-9093-ea866db28570 updated
Jan 27 22:34:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:21.328 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 96689921-9f22-479c-9093-ea866db28570, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:34:21 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:21.329 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd166a6-179f-4a98-bf47-190f216f9e78]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:34:22 compute-1 nova_compute[183751]: 2026-01-27 22:34:22.284 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:25 compute-1 podman[219478]: 2026-01-27 22:34:25.777519939 +0000 UTC m=+0.072033401 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:34:25 compute-1 podman[219477]: 2026-01-27 22:34:25.795198225 +0000 UTC m=+0.092762482 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Jan 27 22:34:27 compute-1 nova_compute[183751]: 2026-01-27 22:34:27.286 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:27 compute-1 nova_compute[183751]: 2026-01-27 22:34:27.288 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:27 compute-1 nova_compute[183751]: 2026-01-27 22:34:27.288 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:34:27 compute-1 nova_compute[183751]: 2026-01-27 22:34:27.288 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:27 compute-1 nova_compute[183751]: 2026-01-27 22:34:27.288 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:27 compute-1 nova_compute[183751]: 2026-01-27 22:34:27.290 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:32 compute-1 nova_compute[183751]: 2026-01-27 22:34:32.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:34:32 compute-1 nova_compute[183751]: 2026-01-27 22:34:32.291 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:32 compute-1 nova_compute[183751]: 2026-01-27 22:34:32.293 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:32 compute-1 nova_compute[183751]: 2026-01-27 22:34:32.294 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:34:32 compute-1 nova_compute[183751]: 2026-01-27 22:34:32.294 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:32 compute-1 nova_compute[183751]: 2026-01-27 22:34:32.320 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:32 compute-1 nova_compute[183751]: 2026-01-27 22:34:32.321 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:33.252 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:1d:c1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b463b6f5-cf24-4067-bc9d-7129eb15bebd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b463b6f5-cf24-4067-bc9d-7129eb15bebd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82f5eb2628574b7f8d17963abaa22290', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a02554f9-c5f2-448b-af40-b514d42067a0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=91bb1cc1-6810-4d58-a054-5287d2687f5d) old=Port_Binding(mac=['fa:16:3e:87:1d:c1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b463b6f5-cf24-4067-bc9d-7129eb15bebd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b463b6f5-cf24-4067-bc9d-7129eb15bebd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82f5eb2628574b7f8d17963abaa22290', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:34:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:33.254 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 91bb1cc1-6810-4d58-a054-5287d2687f5d in datapath b463b6f5-cf24-4067-bc9d-7129eb15bebd updated
Jan 27 22:34:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:33.255 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b463b6f5-cf24-4067-bc9d-7129eb15bebd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:34:33 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:33.256 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[f9174ef1-e5b1-40b1-ae67-14d5b9d23846]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:34:34 compute-1 podman[219517]: 2026-01-27 22:34:34.757078731 +0000 UTC m=+0.066783441 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:34:35 compute-1 podman[193064]: time="2026-01-27T22:34:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:34:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:34:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:34:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:34:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Jan 27 22:34:37 compute-1 nova_compute[183751]: 2026-01-27 22:34:37.322 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:37 compute-1 nova_compute[183751]: 2026-01-27 22:34:37.324 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:37 compute-1 nova_compute[183751]: 2026-01-27 22:34:37.325 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:34:37 compute-1 nova_compute[183751]: 2026-01-27 22:34:37.325 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:37 compute-1 nova_compute[183751]: 2026-01-27 22:34:37.364 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:37 compute-1 nova_compute[183751]: 2026-01-27 22:34:37.365 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:42 compute-1 nova_compute[183751]: 2026-01-27 22:34:42.366 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:42 compute-1 nova_compute[183751]: 2026-01-27 22:34:42.368 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:42 compute-1 nova_compute[183751]: 2026-01-27 22:34:42.368 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:34:42 compute-1 nova_compute[183751]: 2026-01-27 22:34:42.368 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:42 compute-1 nova_compute[183751]: 2026-01-27 22:34:42.402 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:42 compute-1 nova_compute[183751]: 2026-01-27 22:34:42.402 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:47 compute-1 nova_compute[183751]: 2026-01-27 22:34:47.404 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:47 compute-1 nova_compute[183751]: 2026-01-27 22:34:47.406 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:34:47 compute-1 nova_compute[183751]: 2026-01-27 22:34:47.406 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:34:47 compute-1 nova_compute[183751]: 2026-01-27 22:34:47.407 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:47 compute-1 nova_compute[183751]: 2026-01-27 22:34:47.441 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:47 compute-1 nova_compute[183751]: 2026-01-27 22:34:47.441 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:34:49 compute-1 openstack_network_exporter[195945]: ERROR   22:34:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:34:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:34:49 compute-1 openstack_network_exporter[195945]: ERROR   22:34:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:34:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:34:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:50.074 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:34:50 compute-1 nova_compute[183751]: 2026-01-27 22:34:50.075 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:50 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:50.075 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:34:50 compute-1 podman[219544]: 2026-01-27 22:34:50.837761533 +0000 UTC m=+0.135292193 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 27 22:34:52 compute-1 nova_compute[183751]: 2026-01-27 22:34:52.492 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:34:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:34:53.077 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:34:56 compute-1 podman[219573]: 2026-01-27 22:34:56.79739204 +0000 UTC m=+0.095142321 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 22:34:56 compute-1 podman[219574]: 2026-01-27 22:34:56.800256291 +0000 UTC m=+0.090413725 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260126)
Jan 27 22:34:57 compute-1 nova_compute[183751]: 2026-01-27 22:34:57.494 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:02 compute-1 nova_compute[183751]: 2026-01-27 22:35:02.496 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:02 compute-1 nova_compute[183751]: 2026-01-27 22:35:02.498 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:02 compute-1 nova_compute[183751]: 2026-01-27 22:35:02.498 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:35:02 compute-1 nova_compute[183751]: 2026-01-27 22:35:02.498 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:02 compute-1 nova_compute[183751]: 2026-01-27 22:35:02.537 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:02 compute-1 nova_compute[183751]: 2026-01-27 22:35:02.538 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.665 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.666 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.903 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.905 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.930 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.931 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5853MB free_disk=73.13686752319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.931 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:35:03 compute-1 nova_compute[183751]: 2026-01-27 22:35:03.932 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:35:04 compute-1 nova_compute[183751]: 2026-01-27 22:35:04.998 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:35:04 compute-1 nova_compute[183751]: 2026-01-27 22:35:04.998 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:35:03 up  2:37,  0 user,  load average: 0.03, 0.16, 0.14\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:35:05 compute-1 nova_compute[183751]: 2026-01-27 22:35:05.211 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:35:05 compute-1 podman[193064]: time="2026-01-27T22:35:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:35:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:35:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:35:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:35:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 27 22:35:05 compute-1 nova_compute[183751]: 2026-01-27 22:35:05.721 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:35:05 compute-1 podman[219615]: 2026-01-27 22:35:05.803460926 +0000 UTC m=+0.105460716 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:35:06 compute-1 nova_compute[183751]: 2026-01-27 22:35:06.234 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:35:06 compute-1 nova_compute[183751]: 2026-01-27 22:35:06.234 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.302s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:35:07 compute-1 nova_compute[183751]: 2026-01-27 22:35:07.539 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:07 compute-1 nova_compute[183751]: 2026-01-27 22:35:07.541 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:07 compute-1 nova_compute[183751]: 2026-01-27 22:35:07.542 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:35:07 compute-1 nova_compute[183751]: 2026-01-27 22:35:07.542 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:07 compute-1 nova_compute[183751]: 2026-01-27 22:35:07.577 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:07 compute-1 nova_compute[183751]: 2026-01-27 22:35:07.578 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:09 compute-1 nova_compute[183751]: 2026-01-27 22:35:09.230 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:35:09 compute-1 nova_compute[183751]: 2026-01-27 22:35:09.231 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:35:09 compute-1 nova_compute[183751]: 2026-01-27 22:35:09.231 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:35:09 compute-1 nova_compute[183751]: 2026-01-27 22:35:09.231 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:35:09 compute-1 nova_compute[183751]: 2026-01-27 22:35:09.231 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:35:11 compute-1 nova_compute[183751]: 2026-01-27 22:35:11.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:35:11 compute-1 nova_compute[183751]: 2026-01-27 22:35:11.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:35:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:35:11.289 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:35:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:35:11.289 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:35:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:35:11.289 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:35:12 compute-1 nova_compute[183751]: 2026-01-27 22:35:12.579 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:12 compute-1 nova_compute[183751]: 2026-01-27 22:35:12.581 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:12 compute-1 nova_compute[183751]: 2026-01-27 22:35:12.581 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:35:12 compute-1 nova_compute[183751]: 2026-01-27 22:35:12.582 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:12 compute-1 nova_compute[183751]: 2026-01-27 22:35:12.615 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:12 compute-1 nova_compute[183751]: 2026-01-27 22:35:12.616 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:17 compute-1 nova_compute[183751]: 2026-01-27 22:35:17.617 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:17 compute-1 nova_compute[183751]: 2026-01-27 22:35:17.619 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:17 compute-1 nova_compute[183751]: 2026-01-27 22:35:17.619 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:35:17 compute-1 nova_compute[183751]: 2026-01-27 22:35:17.620 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:17 compute-1 nova_compute[183751]: 2026-01-27 22:35:17.663 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:17 compute-1 nova_compute[183751]: 2026-01-27 22:35:17.664 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:19 compute-1 openstack_network_exporter[195945]: ERROR   22:35:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:35:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:35:19 compute-1 openstack_network_exporter[195945]: ERROR   22:35:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:35:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:35:21 compute-1 podman[219641]: 2026-01-27 22:35:21.835335343 +0000 UTC m=+0.131552881 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:35:22 compute-1 nova_compute[183751]: 2026-01-27 22:35:22.665 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:22 compute-1 nova_compute[183751]: 2026-01-27 22:35:22.667 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:22 compute-1 nova_compute[183751]: 2026-01-27 22:35:22.667 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:35:22 compute-1 nova_compute[183751]: 2026-01-27 22:35:22.668 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:22 compute-1 nova_compute[183751]: 2026-01-27 22:35:22.701 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:22 compute-1 nova_compute[183751]: 2026-01-27 22:35:22.701 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:27 compute-1 nova_compute[183751]: 2026-01-27 22:35:27.702 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:27 compute-1 nova_compute[183751]: 2026-01-27 22:35:27.704 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:27 compute-1 nova_compute[183751]: 2026-01-27 22:35:27.704 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:35:27 compute-1 nova_compute[183751]: 2026-01-27 22:35:27.704 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:27 compute-1 nova_compute[183751]: 2026-01-27 22:35:27.716 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:27 compute-1 nova_compute[183751]: 2026-01-27 22:35:27.716 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:27 compute-1 podman[219670]: 2026-01-27 22:35:27.801060939 +0000 UTC m=+0.086896657 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 27 22:35:27 compute-1 podman[219669]: 2026-01-27 22:35:27.817430574 +0000 UTC m=+0.110924031 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 22:35:32 compute-1 nova_compute[183751]: 2026-01-27 22:35:32.717 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:35 compute-1 podman[193064]: time="2026-01-27T22:35:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:35:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:35:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:35:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:35:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2183 "" "Go-http-client/1.1"
Jan 27 22:35:36 compute-1 podman[219710]: 2026-01-27 22:35:36.776388515 +0000 UTC m=+0.080574651 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:35:37 compute-1 nova_compute[183751]: 2026-01-27 22:35:37.720 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:37 compute-1 nova_compute[183751]: 2026-01-27 22:35:37.721 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:37 compute-1 nova_compute[183751]: 2026-01-27 22:35:37.721 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:35:37 compute-1 nova_compute[183751]: 2026-01-27 22:35:37.721 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:37 compute-1 nova_compute[183751]: 2026-01-27 22:35:37.761 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:37 compute-1 nova_compute[183751]: 2026-01-27 22:35:37.762 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:42 compute-1 nova_compute[183751]: 2026-01-27 22:35:42.763 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:42 compute-1 nova_compute[183751]: 2026-01-27 22:35:42.765 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:42 compute-1 nova_compute[183751]: 2026-01-27 22:35:42.765 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:35:42 compute-1 nova_compute[183751]: 2026-01-27 22:35:42.765 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:42 compute-1 nova_compute[183751]: 2026-01-27 22:35:42.811 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:42 compute-1 nova_compute[183751]: 2026-01-27 22:35:42.812 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:47 compute-1 nova_compute[183751]: 2026-01-27 22:35:47.812 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:47 compute-1 nova_compute[183751]: 2026-01-27 22:35:47.864 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:47 compute-1 nova_compute[183751]: 2026-01-27 22:35:47.864 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5052 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:35:47 compute-1 nova_compute[183751]: 2026-01-27 22:35:47.864 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:47 compute-1 nova_compute[183751]: 2026-01-27 22:35:47.866 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:47 compute-1 nova_compute[183751]: 2026-01-27 22:35:47.867 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:49 compute-1 openstack_network_exporter[195945]: ERROR   22:35:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:35:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:35:49 compute-1 openstack_network_exporter[195945]: ERROR   22:35:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:35:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:35:52 compute-1 podman[219735]: 2026-01-27 22:35:52.833219789 +0000 UTC m=+0.130350821 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 27 22:35:52 compute-1 nova_compute[183751]: 2026-01-27 22:35:52.867 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:52 compute-1 nova_compute[183751]: 2026-01-27 22:35:52.869 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:35:52 compute-1 nova_compute[183751]: 2026-01-27 22:35:52.869 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:35:52 compute-1 nova_compute[183751]: 2026-01-27 22:35:52.869 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:52 compute-1 nova_compute[183751]: 2026-01-27 22:35:52.905 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:52 compute-1 nova_compute[183751]: 2026-01-27 22:35:52.906 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:35:57 compute-1 nova_compute[183751]: 2026-01-27 22:35:57.906 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:35:58 compute-1 podman[219762]: 2026-01-27 22:35:58.800390391 +0000 UTC m=+0.081240388 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Jan 27 22:35:58 compute-1 podman[219761]: 2026-01-27 22:35:58.813185917 +0000 UTC m=+0.097139051 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 22:36:02 compute-1 nova_compute[183751]: 2026-01-27 22:36:02.909 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:36:02 compute-1 nova_compute[183751]: 2026-01-27 22:36:02.911 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:36:02 compute-1 nova_compute[183751]: 2026-01-27 22:36:02.912 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:36:02 compute-1 nova_compute[183751]: 2026-01-27 22:36:02.912 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:02 compute-1 nova_compute[183751]: 2026-01-27 22:36:02.936 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:02 compute-1 nova_compute[183751]: 2026-01-27 22:36:02.937 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:36:04.808 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:36:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:36:04.810 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:36:04 compute-1 nova_compute[183751]: 2026-01-27 22:36:04.811 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:36:05 compute-1 podman[193064]: time="2026-01-27T22:36:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:36:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:36:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:36:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:36:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:36:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:36:05.811 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.876 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.879 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.911 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.912 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5844MB free_disk=73.13686752319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.913 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:36:05 compute-1 nova_compute[183751]: 2026-01-27 22:36:05.914 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:36:07 compute-1 nova_compute[183751]: 2026-01-27 22:36:07.008 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:36:07 compute-1 nova_compute[183751]: 2026-01-27 22:36:07.009 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:36:05 up  2:38,  0 user,  load average: 0.08, 0.14, 0.14\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:36:07 compute-1 nova_compute[183751]: 2026-01-27 22:36:07.051 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 22:36:07 compute-1 nova_compute[183751]: 2026-01-27 22:36:07.076 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 22:36:07 compute-1 nova_compute[183751]: 2026-01-27 22:36:07.077 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:36:07 compute-1 nova_compute[183751]: 2026-01-27 22:36:07.091 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 22:36:07 compute-1 nova_compute[183751]: 2026-01-27 22:36:07.114 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 22:36:07 compute-1 nova_compute[183751]: 2026-01-27 22:36:07.164 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:36:07 compute-1 nova_compute[183751]: 2026-01-27 22:36:07.674 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:36:07 compute-1 podman[219801]: 2026-01-27 22:36:07.787942938 +0000 UTC m=+0.087962984 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:36:07 compute-1 nova_compute[183751]: 2026-01-27 22:36:07.938 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:08 compute-1 nova_compute[183751]: 2026-01-27 22:36:08.186 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:36:08 compute-1 nova_compute[183751]: 2026-01-27 22:36:08.186 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.272s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:36:10 compute-1 nova_compute[183751]: 2026-01-27 22:36:10.188 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:36:10 compute-1 nova_compute[183751]: 2026-01-27 22:36:10.189 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:36:10 compute-1 nova_compute[183751]: 2026-01-27 22:36:10.189 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:36:10 compute-1 nova_compute[183751]: 2026-01-27 22:36:10.190 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:36:11 compute-1 nova_compute[183751]: 2026-01-27 22:36:11.150 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:36:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:36:11.291 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:36:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:36:11.292 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:36:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:36:11.292 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:36:12 compute-1 nova_compute[183751]: 2026-01-27 22:36:12.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:36:12 compute-1 nova_compute[183751]: 2026-01-27 22:36:12.940 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:36:12 compute-1 nova_compute[183751]: 2026-01-27 22:36:12.941 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:12 compute-1 nova_compute[183751]: 2026-01-27 22:36:12.942 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:36:12 compute-1 nova_compute[183751]: 2026-01-27 22:36:12.942 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:12 compute-1 nova_compute[183751]: 2026-01-27 22:36:12.943 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:12 compute-1 nova_compute[183751]: 2026-01-27 22:36:12.944 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:13 compute-1 nova_compute[183751]: 2026-01-27 22:36:13.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:36:17 compute-1 nova_compute[183751]: 2026-01-27 22:36:17.944 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:19 compute-1 openstack_network_exporter[195945]: ERROR   22:36:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:36:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:36:19 compute-1 openstack_network_exporter[195945]: ERROR   22:36:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:36:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:36:22 compute-1 nova_compute[183751]: 2026-01-27 22:36:22.946 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:23 compute-1 podman[219827]: 2026-01-27 22:36:23.842359153 +0000 UTC m=+0.153774620 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:36:27 compute-1 nova_compute[183751]: 2026-01-27 22:36:27.948 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:36:27 compute-1 nova_compute[183751]: 2026-01-27 22:36:27.949 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:27 compute-1 nova_compute[183751]: 2026-01-27 22:36:27.949 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:36:27 compute-1 nova_compute[183751]: 2026-01-27 22:36:27.950 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:27 compute-1 nova_compute[183751]: 2026-01-27 22:36:27.950 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:27 compute-1 nova_compute[183751]: 2026-01-27 22:36:27.951 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:28 compute-1 sshd-session[219856]: Received disconnect from 91.224.92.190 port 15776:11:  [preauth]
Jan 27 22:36:28 compute-1 sshd-session[219856]: Disconnected from authenticating user root 91.224.92.190 port 15776 [preauth]
Jan 27 22:36:29 compute-1 podman[219858]: 2026-01-27 22:36:29.782499209 +0000 UTC m=+0.082408907 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, architecture=x86_64, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 27 22:36:29 compute-1 podman[219859]: 2026-01-27 22:36:29.819100363 +0000 UTC m=+0.104908052 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:36:32 compute-1 nova_compute[183751]: 2026-01-27 22:36:32.952 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:36:32 compute-1 nova_compute[183751]: 2026-01-27 22:36:32.954 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:36:32 compute-1 nova_compute[183751]: 2026-01-27 22:36:32.955 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:36:32 compute-1 nova_compute[183751]: 2026-01-27 22:36:32.955 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:32 compute-1 nova_compute[183751]: 2026-01-27 22:36:32.984 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:32 compute-1 nova_compute[183751]: 2026-01-27 22:36:32.985 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:35 compute-1 nova_compute[183751]: 2026-01-27 22:36:35.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:36:35 compute-1 podman[193064]: time="2026-01-27T22:36:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:36:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:36:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:36:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:36:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Jan 27 22:36:36 compute-1 sshd-session[219900]: Invalid user user from 45.148.10.121 port 49852
Jan 27 22:36:36 compute-1 sshd-session[219900]: Connection closed by invalid user user 45.148.10.121 port 49852 [preauth]
Jan 27 22:36:37 compute-1 nova_compute[183751]: 2026-01-27 22:36:37.986 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:36:37 compute-1 nova_compute[183751]: 2026-01-27 22:36:37.987 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:37 compute-1 nova_compute[183751]: 2026-01-27 22:36:37.987 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:36:37 compute-1 nova_compute[183751]: 2026-01-27 22:36:37.987 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:37 compute-1 nova_compute[183751]: 2026-01-27 22:36:37.988 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:37 compute-1 nova_compute[183751]: 2026-01-27 22:36:37.988 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:38 compute-1 podman[219902]: 2026-01-27 22:36:38.798303974 +0000 UTC m=+0.096263789 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:36:42 compute-1 nova_compute[183751]: 2026-01-27 22:36:42.989 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:47 compute-1 nova_compute[183751]: 2026-01-27 22:36:47.991 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:36:49 compute-1 openstack_network_exporter[195945]: ERROR   22:36:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:36:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:36:49 compute-1 openstack_network_exporter[195945]: ERROR   22:36:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:36:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:36:52 compute-1 nova_compute[183751]: 2026-01-27 22:36:52.993 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:36:52 compute-1 nova_compute[183751]: 2026-01-27 22:36:52.995 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:36:52 compute-1 nova_compute[183751]: 2026-01-27 22:36:52.996 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:36:52 compute-1 nova_compute[183751]: 2026-01-27 22:36:52.996 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:53 compute-1 nova_compute[183751]: 2026-01-27 22:36:53.027 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:36:53 compute-1 nova_compute[183751]: 2026-01-27 22:36:53.028 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:36:54 compute-1 podman[219927]: 2026-01-27 22:36:54.831831013 +0000 UTC m=+0.136281857 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 27 22:36:58 compute-1 nova_compute[183751]: 2026-01-27 22:36:58.029 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:37:00 compute-1 podman[219955]: 2026-01-27 22:37:00.779435941 +0000 UTC m=+0.075136427 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 27 22:37:00 compute-1 podman[219954]: 2026-01-27 22:37:00.798548253 +0000 UTC m=+0.096377112 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, version=9.6, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 27 22:37:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:01.703 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:91:5d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a0500b54ccc240539a97125971e92178', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f8e41a7-2994-4b63-aa2e-14041916e5f8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d5030a57-5c97-4207-a276-f5ed6938ad09) old=Port_Binding(mac=['fa:16:3e:2d:91:5d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a0500b54ccc240539a97125971e92178', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:37:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:01.705 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d5030a57-5c97-4207-a276-f5ed6938ad09 in datapath 24d39604-44db-4002-b4d8-7ef0b15b5533 updated
Jan 27 22:37:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:01.706 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24d39604-44db-4002-b4d8-7ef0b15b5533, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:37:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:01.709 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[de14cc3c-55c9-4d66-8f08-74d7933017cf]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:03 compute-1 nova_compute[183751]: 2026-01-27 22:37:03.031 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:37:03 compute-1 nova_compute[183751]: 2026-01-27 22:37:03.033 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:37:03 compute-1 nova_compute[183751]: 2026-01-27 22:37:03.033 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Jan 27 22:37:03 compute-1 nova_compute[183751]: 2026-01-27 22:37:03.033 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:37:03 compute-1 nova_compute[183751]: 2026-01-27 22:37:03.069 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:03 compute-1 nova_compute[183751]: 2026-01-27 22:37:03.070 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Jan 27 22:37:05 compute-1 nova_compute[183751]: 2026-01-27 22:37:05.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:05 compute-1 nova_compute[183751]: 2026-01-27 22:37:05.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 22:37:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:05.259 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:37:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:05.297 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:37:05 compute-1 nova_compute[183751]: 2026-01-27 22:37:05.299 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:05 compute-1 podman[193064]: time="2026-01-27T22:37:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:37:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:37:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:37:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:37:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2184 "" "Go-http-client/1.1"
Jan 27 22:37:05 compute-1 nova_compute[183751]: 2026-01-27 22:37:05.653 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 22:37:06 compute-1 nova_compute[183751]: 2026-01-27 22:37:06.655 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.705 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.706 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.706 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.707 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.930 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.932 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.958 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.959 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5867MB free_disk=73.13686752319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.960 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:07 compute-1 nova_compute[183751]: 2026-01-27 22:37:07.960 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:08 compute-1 nova_compute[183751]: 2026-01-27 22:37:08.070 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:09 compute-1 nova_compute[183751]: 2026-01-27 22:37:09.025 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:37:09 compute-1 nova_compute[183751]: 2026-01-27 22:37:09.026 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:37:07 up  2:39,  0 user,  load average: 0.03, 0.11, 0.13\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:37:09 compute-1 nova_compute[183751]: 2026-01-27 22:37:09.055 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:37:09 compute-1 nova_compute[183751]: 2026-01-27 22:37:09.563 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:37:09 compute-1 podman[219996]: 2026-01-27 22:37:09.793345069 +0000 UTC m=+0.095960671 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:37:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:09.890 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:da:df 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7c67d2bb-6ba1-46ee-94e3-ed5603cfd1cd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7c67d2bb-6ba1-46ee-94e3-ed5603cfd1cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f85db165fdfb4bf4a093051065554230', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4201b15-486e-49bf-b7fb-c9523be7e256, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=df55b007-9188-4c78-a897-a82944323fa1) old=Port_Binding(mac=['fa:16:3e:75:da:df'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7c67d2bb-6ba1-46ee-94e3-ed5603cfd1cd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7c67d2bb-6ba1-46ee-94e3-ed5603cfd1cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f85db165fdfb4bf4a093051065554230', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:37:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:09.891 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port df55b007-9188-4c78-a897-a82944323fa1 in datapath 7c67d2bb-6ba1-46ee-94e3-ed5603cfd1cd updated
Jan 27 22:37:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:09.892 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7c67d2bb-6ba1-46ee-94e3-ed5603cfd1cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:37:09 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:09.893 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c8c3ebc5-d3b0-4040-b416-49a4a8364042]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:10 compute-1 nova_compute[183751]: 2026-01-27 22:37:10.075 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:37:10 compute-1 nova_compute[183751]: 2026-01-27 22:37:10.075 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:10 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:10.299 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:37:11 compute-1 nova_compute[183751]: 2026-01-27 22:37:11.076 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:11 compute-1 nova_compute[183751]: 2026-01-27 22:37:11.076 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:11 compute-1 nova_compute[183751]: 2026-01-27 22:37:11.077 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:11 compute-1 nova_compute[183751]: 2026-01-27 22:37:11.077 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:37:11 compute-1 nova_compute[183751]: 2026-01-27 22:37:11.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:11.293 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:11.293 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:11.294 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:13 compute-1 nova_compute[183751]: 2026-01-27 22:37:13.072 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:37:13 compute-1 nova_compute[183751]: 2026-01-27 22:37:13.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:15 compute-1 nova_compute[183751]: 2026-01-27 22:37:15.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:18 compute-1 nova_compute[183751]: 2026-01-27 22:37:18.077 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:37:19 compute-1 openstack_network_exporter[195945]: ERROR   22:37:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:37:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:37:19 compute-1 openstack_network_exporter[195945]: ERROR   22:37:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:37:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:37:23 compute-1 nova_compute[183751]: 2026-01-27 22:37:23.079 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:23 compute-1 nova_compute[183751]: 2026-01-27 22:37:23.082 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:23 compute-1 nova_compute[183751]: 2026-01-27 22:37:23.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:23 compute-1 nova_compute[183751]: 2026-01-27 22:37:23.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 22:37:27 compute-1 podman[220022]: 2026-01-27 22:37:27.342830931 +0000 UTC m=+0.147825251 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:37:28 compute-1 nova_compute[183751]: 2026-01-27 22:37:28.080 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:28 compute-1 nova_compute[183751]: 2026-01-27 22:37:28.083 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:31 compute-1 podman[220052]: 2026-01-27 22:37:31.785641631 +0000 UTC m=+0.083229917 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:37:31 compute-1 podman[220051]: 2026-01-27 22:37:31.786607985 +0000 UTC m=+0.090953868 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest 
release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 27 22:37:33 compute-1 nova_compute[183751]: 2026-01-27 22:37:33.081 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:33 compute-1 nova_compute[183751]: 2026-01-27 22:37:33.085 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:35 compute-1 podman[193064]: time="2026-01-27T22:37:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:37:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:37:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:37:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:37:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2185 "" "Go-http-client/1.1"
Jan 27 22:37:38 compute-1 nova_compute[183751]: 2026-01-27 22:37:38.083 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:38 compute-1 nova_compute[183751]: 2026-01-27 22:37:38.087 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:40 compute-1 podman[220092]: 2026-01-27 22:37:40.746009036 +0000 UTC m=+0.057772908 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:37:40 compute-1 nova_compute[183751]: 2026-01-27 22:37:40.819 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "4ace501f-e487-4354-a77c-b4c3dde921eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:40 compute-1 nova_compute[183751]: 2026-01-27 22:37:40.819 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:41 compute-1 nova_compute[183751]: 2026-01-27 22:37:41.328 183755 DEBUG nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 27 22:37:41 compute-1 nova_compute[183751]: 2026-01-27 22:37:41.873 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:41 compute-1 nova_compute[183751]: 2026-01-27 22:37:41.874 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:41 compute-1 nova_compute[183751]: 2026-01-27 22:37:41.883 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 27 22:37:41 compute-1 nova_compute[183751]: 2026-01-27 22:37:41.883 183755 INFO nova.compute.claims [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Claim successful on node compute-1.ctlplane.example.com
Jan 27 22:37:42 compute-1 nova_compute[183751]: 2026-01-27 22:37:42.942 183755 DEBUG nova.compute.provider_tree [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:37:43 compute-1 nova_compute[183751]: 2026-01-27 22:37:43.087 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:43 compute-1 nova_compute[183751]: 2026-01-27 22:37:43.451 183755 DEBUG nova.scheduler.client.report [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:37:43 compute-1 nova_compute[183751]: 2026-01-27 22:37:43.965 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:43 compute-1 nova_compute[183751]: 2026-01-27 22:37:43.966 183755 DEBUG nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 27 22:37:44 compute-1 nova_compute[183751]: 2026-01-27 22:37:44.479 183755 DEBUG nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 27 22:37:44 compute-1 nova_compute[183751]: 2026-01-27 22:37:44.480 183755 DEBUG nova.network.neutron [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 27 22:37:44 compute-1 nova_compute[183751]: 2026-01-27 22:37:44.480 183755 WARNING neutronclient.v2_0.client [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:37:44 compute-1 nova_compute[183751]: 2026-01-27 22:37:44.481 183755 WARNING neutronclient.v2_0.client [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:37:44 compute-1 nova_compute[183751]: 2026-01-27 22:37:44.989 183755 INFO nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 27 22:37:45 compute-1 nova_compute[183751]: 2026-01-27 22:37:45.501 183755 DEBUG nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.537 183755 DEBUG nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.539 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.539 183755 INFO nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Creating image(s)
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.540 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "/var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.541 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "/var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.542 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "/var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.543 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.549 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.553 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.636 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.637 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.638 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.638 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.641 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.641 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.695 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.696 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.733 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.735 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.735 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.820 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.822 183755 DEBUG nova.virt.disk.api [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Checking if we can resize image /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.823 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.887 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.889 183755 DEBUG nova.virt.disk.api [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Cannot resize image /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.890 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.890 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Ensure instance console log exists: /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.891 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.892 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.893 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:46 compute-1 nova_compute[183751]: 2026-01-27 22:37:46.942 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:46 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:46.943 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:37:46 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:46.944 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:37:47 compute-1 nova_compute[183751]: 2026-01-27 22:37:47.090 183755 DEBUG nova.network.neutron [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Successfully created port: a745dd2e-98f7-4d01-9419-3b72fc35eecb _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 27 22:37:47 compute-1 nova_compute[183751]: 2026-01-27 22:37:47.630 183755 DEBUG nova.network.neutron [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Successfully updated port: a745dd2e-98f7-4d01-9419-3b72fc35eecb _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 27 22:37:47 compute-1 nova_compute[183751]: 2026-01-27 22:37:47.689 183755 DEBUG nova.compute.manager [req-b340938a-8bf0-485f-82cc-bf5f1256f5a6 req-8bc8877a-49be-4849-bc19-164f7503c740 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Received event network-changed-a745dd2e-98f7-4d01-9419-3b72fc35eecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:37:47 compute-1 nova_compute[183751]: 2026-01-27 22:37:47.689 183755 DEBUG nova.compute.manager [req-b340938a-8bf0-485f-82cc-bf5f1256f5a6 req-8bc8877a-49be-4849-bc19-164f7503c740 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Refreshing instance network info cache due to event network-changed-a745dd2e-98f7-4d01-9419-3b72fc35eecb. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 27 22:37:47 compute-1 nova_compute[183751]: 2026-01-27 22:37:47.690 183755 DEBUG oslo_concurrency.lockutils [req-b340938a-8bf0-485f-82cc-bf5f1256f5a6 req-8bc8877a-49be-4849-bc19-164f7503c740 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "refresh_cache-4ace501f-e487-4354-a77c-b4c3dde921eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:37:47 compute-1 nova_compute[183751]: 2026-01-27 22:37:47.690 183755 DEBUG oslo_concurrency.lockutils [req-b340938a-8bf0-485f-82cc-bf5f1256f5a6 req-8bc8877a-49be-4849-bc19-164f7503c740 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquired lock "refresh_cache-4ace501f-e487-4354-a77c-b4c3dde921eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:37:47 compute-1 nova_compute[183751]: 2026-01-27 22:37:47.690 183755 DEBUG nova.network.neutron [req-b340938a-8bf0-485f-82cc-bf5f1256f5a6 req-8bc8877a-49be-4849-bc19-164f7503c740 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Refreshing network info cache for port a745dd2e-98f7-4d01-9419-3b72fc35eecb _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 27 22:37:48 compute-1 nova_compute[183751]: 2026-01-27 22:37:48.088 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:48 compute-1 nova_compute[183751]: 2026-01-27 22:37:48.089 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:48 compute-1 nova_compute[183751]: 2026-01-27 22:37:48.138 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "refresh_cache-4ace501f-e487-4354-a77c-b4c3dde921eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:37:48 compute-1 nova_compute[183751]: 2026-01-27 22:37:48.195 183755 WARNING neutronclient.v2_0.client [req-b340938a-8bf0-485f-82cc-bf5f1256f5a6 req-8bc8877a-49be-4849-bc19-164f7503c740 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:37:48 compute-1 nova_compute[183751]: 2026-01-27 22:37:48.301 183755 DEBUG nova.network.neutron [req-b340938a-8bf0-485f-82cc-bf5f1256f5a6 req-8bc8877a-49be-4849-bc19-164f7503c740 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:37:48 compute-1 nova_compute[183751]: 2026-01-27 22:37:48.500 183755 DEBUG nova.network.neutron [req-b340938a-8bf0-485f-82cc-bf5f1256f5a6 req-8bc8877a-49be-4849-bc19-164f7503c740 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:37:49 compute-1 nova_compute[183751]: 2026-01-27 22:37:49.008 183755 DEBUG oslo_concurrency.lockutils [req-b340938a-8bf0-485f-82cc-bf5f1256f5a6 req-8bc8877a-49be-4849-bc19-164f7503c740 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Releasing lock "refresh_cache-4ace501f-e487-4354-a77c-b4c3dde921eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:37:49 compute-1 nova_compute[183751]: 2026-01-27 22:37:49.009 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquired lock "refresh_cache-4ace501f-e487-4354-a77c-b4c3dde921eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:37:49 compute-1 nova_compute[183751]: 2026-01-27 22:37:49.009 183755 DEBUG nova.network.neutron [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 27 22:37:49 compute-1 nova_compute[183751]: 2026-01-27 22:37:49.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:37:49 compute-1 openstack_network_exporter[195945]: ERROR   22:37:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:37:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:37:49 compute-1 openstack_network_exporter[195945]: ERROR   22:37:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:37:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:37:49 compute-1 nova_compute[183751]: 2026-01-27 22:37:49.661 183755 DEBUG nova.network.neutron [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:37:49 compute-1 nova_compute[183751]: 2026-01-27 22:37:49.984 183755 WARNING neutronclient.v2_0.client [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.244 183755 DEBUG nova.network.neutron [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Updating instance_info_cache with network_info: [{"id": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "address": "fa:16:3e:2d:35:aa", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745dd2e-98", "ovs_interfaceid": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.752 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Releasing lock "refresh_cache-4ace501f-e487-4354-a77c-b4c3dde921eb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.753 183755 DEBUG nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Instance network_info: |[{"id": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "address": "fa:16:3e:2d:35:aa", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745dd2e-98", "ovs_interfaceid": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.757 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Start _get_guest_xml network_info=[{"id": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "address": "fa:16:3e:2d:35:aa", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745dd2e-98", "ovs_interfaceid": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '46eb297a-0b7d-41f9-8336-a7ae35b5797e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.762 183755 WARNING nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.764 183755 DEBUG nova.virt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-589587884', uuid='4ace501f-e487-4354-a77c-b4c3dde921eb'), owner=OwnerMeta(userid='84404785aedd471590f8ac69cbbb69db', username='tempest-TestExecuteZoneMigrationStrategy-1358952714-project-admin', projectid='f85db165fdfb4bf4a093051065554230', projectname='tempest-TestExecuteZoneMigrationStrategy-1358952714'), image=ImageMeta(id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "address": "fa:16:3e:2d:35:aa", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745dd2e-98", "ovs_interfaceid": 
"a745dd2e-98f7-4d01-9419-3b72fc35eecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769553470.764184) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.770 183755 DEBUG nova.virt.libvirt.host [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.771 183755 DEBUG nova.virt.libvirt.host [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.774 183755 DEBUG nova.virt.libvirt.host [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.775 183755 DEBUG nova.virt.libvirt.host [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.777 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.777 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-27T22:07:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.777 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.778 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.778 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.778 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.778 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.778 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.779 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.779 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.779 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.779 183755 DEBUG nova.virt.hardware [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.783 183755 DEBUG nova.virt.libvirt.vif [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:37:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-589587884',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-589587884',id=24,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f85db165fdfb4bf4a093051065554230',ramdisk_id='',reservation_id='r-oyo4o1n9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1358952714',owner_user_name='tempest-TestExecu
teZoneMigrationStrategy-1358952714-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:37:45Z,user_data=None,user_id='84404785aedd471590f8ac69cbbb69db',uuid=4ace501f-e487-4354-a77c-b4c3dde921eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "address": "fa:16:3e:2d:35:aa", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745dd2e-98", "ovs_interfaceid": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.784 183755 DEBUG nova.network.os_vif_util [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converting VIF {"id": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "address": "fa:16:3e:2d:35:aa", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745dd2e-98", "ovs_interfaceid": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.784 183755 DEBUG nova.network.os_vif_util [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:35:aa,bridge_name='br-int',has_traffic_filtering=True,id=a745dd2e-98f7-4d01-9419-3b72fc35eecb,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745dd2e-98') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:37:50 compute-1 nova_compute[183751]: 2026-01-27 22:37:50.785 183755 DEBUG nova.objects.instance [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4ace501f-e487-4354-a77c-b4c3dde921eb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.294 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] End _get_guest_xml xml=<domain type="kvm">
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <uuid>4ace501f-e487-4354-a77c-b4c3dde921eb</uuid>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <name>instance-00000018</name>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <memory>131072</memory>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <vcpu>1</vcpu>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <metadata>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-589587884</nova:name>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <nova:creationTime>2026-01-27 22:37:50</nova:creationTime>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <nova:flavor name="m1.nano" id="4ed8ebce-d846-4d13-950a-4fb8c3a02942">
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:memory>128</nova:memory>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:disk>1</nova:disk>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:swap>0</nova:swap>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:ephemeral>0</nova:ephemeral>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:vcpus>1</nova:vcpus>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:extraSpecs>
Jan 27 22:37:51 compute-1 nova_compute[183751]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         </nova:extraSpecs>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       </nova:flavor>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <nova:image uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e">
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:minDisk>1</nova:minDisk>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:minRam>0</nova:minRam>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:properties>
Jan 27 22:37:51 compute-1 nova_compute[183751]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         </nova:properties>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       </nova:image>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <nova:owner>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:user uuid="84404785aedd471590f8ac69cbbb69db">tempest-TestExecuteZoneMigrationStrategy-1358952714-project-admin</nova:user>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:project uuid="f85db165fdfb4bf4a093051065554230">tempest-TestExecuteZoneMigrationStrategy-1358952714</nova:project>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       </nova:owner>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <nova:root type="image" uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <nova:ports>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         <nova:port uuid="a745dd2e-98f7-4d01-9419-3b72fc35eecb">
Jan 27 22:37:51 compute-1 nova_compute[183751]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:         </nova:port>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       </nova:ports>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     </nova:instance>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   </metadata>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <sysinfo type="smbios">
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <system>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <entry name="manufacturer">RDO</entry>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <entry name="product">OpenStack Compute</entry>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <entry name="serial">4ace501f-e487-4354-a77c-b4c3dde921eb</entry>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <entry name="uuid">4ace501f-e487-4354-a77c-b4c3dde921eb</entry>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <entry name="family">Virtual Machine</entry>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     </system>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   </sysinfo>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <os>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <boot dev="hd"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <smbios mode="sysinfo"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   </os>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <features>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <acpi/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <apic/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <vmcoreinfo/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   </features>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <clock offset="utc">
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <timer name="pit" tickpolicy="delay"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <timer name="hpet" present="no"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   </clock>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <cpu mode="custom" match="exact">
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <model>Nehalem</model>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <topology sockets="1" cores="1" threads="1"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   </cpu>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   <devices>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <disk type="file" device="disk">
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <target dev="vda" bus="virtio"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <disk type="file" device="cdrom">
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <driver name="qemu" type="raw" cache="none"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk.config"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <target dev="sda" bus="sata"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <interface type="ethernet">
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <mac address="fa:16:3e:2d:35:aa"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <driver name="vhost" rx_queue_size="512"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <mtu size="1442"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <target dev="tapa745dd2e-98"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     </interface>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <serial type="pty">
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <log file="/var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/console.log" append="off"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     </serial>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <video>
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     </video>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <input type="tablet" bus="usb"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <rng model="virtio">
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <backend model="random">/dev/urandom</backend>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     </rng>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <controller type="usb" index="0"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 27 22:37:51 compute-1 nova_compute[183751]:       <stats period="10"/>
Jan 27 22:37:51 compute-1 nova_compute[183751]:     </memballoon>
Jan 27 22:37:51 compute-1 nova_compute[183751]:   </devices>
Jan 27 22:37:51 compute-1 nova_compute[183751]: </domain>
Jan 27 22:37:51 compute-1 nova_compute[183751]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.296 183755 DEBUG nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Preparing to wait for external event network-vif-plugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.297 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.297 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.297 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.298 183755 DEBUG nova.virt.libvirt.vif [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:37:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-589587884',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-589587884',id=24,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f85db165fdfb4bf4a093051065554230',ramdisk_id='',reservation_id='r-oyo4o1n9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1358952714',owner_user_name='tempest
-TestExecuteZoneMigrationStrategy-1358952714-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:37:45Z,user_data=None,user_id='84404785aedd471590f8ac69cbbb69db',uuid=4ace501f-e487-4354-a77c-b4c3dde921eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "address": "fa:16:3e:2d:35:aa", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745dd2e-98", "ovs_interfaceid": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.299 183755 DEBUG nova.network.os_vif_util [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converting VIF {"id": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "address": "fa:16:3e:2d:35:aa", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745dd2e-98", "ovs_interfaceid": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.300 183755 DEBUG nova.network.os_vif_util [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:35:aa,bridge_name='br-int',has_traffic_filtering=True,id=a745dd2e-98f7-4d01-9419-3b72fc35eecb,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745dd2e-98') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.300 183755 DEBUG os_vif [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:35:aa,bridge_name='br-int',has_traffic_filtering=True,id=a745dd2e-98f7-4d01-9419-3b72fc35eecb,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745dd2e-98') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.301 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.301 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.302 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.303 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.303 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '4ebf5ec5-7311-57ac-a78f-58a6cf6d3bfb', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.355 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.356 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.362 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.363 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa745dd2e-98, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.363 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa745dd2e-98, col_values=(('qos', UUID('57bb519a-187b-4718-87da-dd174cfea392')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.364 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa745dd2e-98, col_values=(('external_ids', {'iface-id': 'a745dd2e-98f7-4d01-9419-3b72fc35eecb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:35:aa', 'vm-uuid': '4ace501f-e487-4354-a77c-b4c3dde921eb'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.366 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:51 compute-1 NetworkManager[56069]: <info>  [1769553471.3670] manager: (tapa745dd2e-98): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.368 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.373 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:51 compute-1 nova_compute[183751]: 2026-01-27 22:37:51.374 183755 INFO os_vif [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:35:aa,bridge_name='br-int',has_traffic_filtering=True,id=a745dd2e-98f7-4d01-9419-3b72fc35eecb,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745dd2e-98')
Jan 27 22:37:51 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:51.945 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:37:52 compute-1 nova_compute[183751]: 2026-01-27 22:37:52.928 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:37:52 compute-1 nova_compute[183751]: 2026-01-27 22:37:52.929 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:37:52 compute-1 nova_compute[183751]: 2026-01-27 22:37:52.929 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] No VIF found with MAC fa:16:3e:2d:35:aa, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:37:52 compute-1 nova_compute[183751]: 2026-01-27 22:37:52.929 183755 INFO nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Using config drive
Jan 27 22:37:53 compute-1 nova_compute[183751]: 2026-01-27 22:37:53.091 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:53 compute-1 nova_compute[183751]: 2026-01-27 22:37:53.440 183755 WARNING neutronclient.v2_0.client [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:37:53 compute-1 nova_compute[183751]: 2026-01-27 22:37:53.617 183755 INFO nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Creating config drive at /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk.config
Jan 27 22:37:53 compute-1 nova_compute[183751]: 2026-01-27 22:37:53.623 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjh5_q4su execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:37:53 compute-1 nova_compute[183751]: 2026-01-27 22:37:53.759 183755 DEBUG oslo_concurrency.processutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpjh5_q4su" returned: 0 in 0.136s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:37:53 compute-1 kernel: tapa745dd2e-98: entered promiscuous mode
Jan 27 22:37:53 compute-1 NetworkManager[56069]: <info>  [1769553473.8368] manager: (tapa745dd2e-98): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Jan 27 22:37:53 compute-1 ovn_controller[95969]: 2026-01-27T22:37:53Z|00101|binding|INFO|Claiming lport a745dd2e-98f7-4d01-9419-3b72fc35eecb for this chassis.
Jan 27 22:37:53 compute-1 ovn_controller[95969]: 2026-01-27T22:37:53Z|00102|binding|INFO|a745dd2e-98f7-4d01-9419-3b72fc35eecb: Claiming fa:16:3e:2d:35:aa 10.100.0.11
Jan 27 22:37:53 compute-1 nova_compute[183751]: 2026-01-27 22:37:53.839 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:53 compute-1 nova_compute[183751]: 2026-01-27 22:37:53.845 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.857 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:35:aa 10.100.0.11'], port_security=['fa:16:3e:2d:35:aa 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4ace501f-e487-4354-a77c-b4c3dde921eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f85db165fdfb4bf4a093051065554230', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95cb9870-b574-46eb-94c8-ffb9aee48a06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f8e41a7-2994-4b63-aa2e-14041916e5f8, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=a745dd2e-98f7-4d01-9419-3b72fc35eecb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.858 105247 INFO neutron.agent.ovn.metadata.agent [-] Port a745dd2e-98f7-4d01-9419-3b72fc35eecb in datapath 24d39604-44db-4002-b4d8-7ef0b15b5533 bound to our chassis
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.859 105247 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24d39604-44db-4002-b4d8-7ef0b15b5533
Jan 27 22:37:53 compute-1 systemd-udevd[220152]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.870 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea420a7-4442-480d-a7b1-fd2c7d191126]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.871 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24d39604-41 in ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.873 212869 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24d39604-40 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.873 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[852c314c-63f0-42a7-b775-74d5abac1d1c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.874 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[6a2d8562-0b38-4f3b-abef-ec59988716d2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:53 compute-1 systemd-machined[155034]: New machine qemu-8-instance-00000018.
Jan 27 22:37:53 compute-1 NetworkManager[56069]: <info>  [1769553473.8957] device (tapa745dd2e-98): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.893 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[cf1f72e8-298f-4ea3-b0ee-62e48e4c8292]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:53 compute-1 NetworkManager[56069]: <info>  [1769553473.8975] device (tapa745dd2e-98): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 27 22:37:53 compute-1 nova_compute[183751]: 2026-01-27 22:37:53.926 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.926 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[43e0e1ae-4fb8-4e5b-acb6-6687dd2bf3dd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:53 compute-1 nova_compute[183751]: 2026-01-27 22:37:53.935 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:53 compute-1 ovn_controller[95969]: 2026-01-27T22:37:53Z|00103|binding|INFO|Setting lport a745dd2e-98f7-4d01-9419-3b72fc35eecb ovn-installed in OVS
Jan 27 22:37:53 compute-1 ovn_controller[95969]: 2026-01-27T22:37:53Z|00104|binding|INFO|Setting lport a745dd2e-98f7-4d01-9419-3b72fc35eecb up in Southbound
Jan 27 22:37:53 compute-1 systemd[1]: Started Virtual Machine qemu-8-instance-00000018.
Jan 27 22:37:53 compute-1 nova_compute[183751]: 2026-01-27 22:37:53.937 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.963 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[e104b09d-e616-440b-9a12-45eecdf52ede]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:53 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:53.970 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[ec380630-4afd-43fd-aada-4ec1d078efbe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:53 compute-1 NetworkManager[56069]: <info>  [1769553473.9711] manager: (tap24d39604-40): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.020 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[da96188c-5ad8-4160-8493-477fe3b6fa66]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.025 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[21851cee-cf2b-4705-8785-28e48adf5c9e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 NetworkManager[56069]: <info>  [1769553474.0549] device (tap24d39604-40): carrier: link connected
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.062 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[7443194c-6e2b-4942-996e-390f6a4cbd29]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.083 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[65f11d9c-98bf-42cb-9484-1b3f8a650a8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24d39604-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:91:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 961632, 'reachable_time': 15493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220185, 'error': None, 'target': 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.103 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[cde1f11e-b1bb-4442-93f5-2710fa7e4d96]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2d:915d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 961632, 'tstamp': 961632}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220186, 'error': None, 'target': 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.125 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[343d77ee-3fe7-4556-acf1-3e1108830c78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24d39604-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:91:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 961632, 'reachable_time': 15493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220187, 'error': None, 'target': 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.171 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[3f40c82b-ac06-4dc9-a645-830a68bd3d68]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.266 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[ec025aee-c607-4be1-bd5d-17c74442d5ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.268 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24d39604-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.268 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.268 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24d39604-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:37:54 compute-1 kernel: tap24d39604-40: entered promiscuous mode
Jan 27 22:37:54 compute-1 NetworkManager[56069]: <info>  [1769553474.2724] manager: (tap24d39604-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.271 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.282 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24d39604-40, col_values=(('external_ids', {'iface-id': 'd5030a57-5c97-4207-a276-f5ed6938ad09'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.280 183755 DEBUG nova.compute.manager [req-7929acdf-f149-47a1-b3f9-04e1bfa9eecd req-61c7dc0f-1a1c-42d6-a190-0f98dfdfc553 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Received event network-vif-plugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.286 183755 DEBUG oslo_concurrency.lockutils [req-7929acdf-f149-47a1-b3f9-04e1bfa9eecd req-61c7dc0f-1a1c-42d6-a190-0f98dfdfc553 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.287 183755 DEBUG oslo_concurrency.lockutils [req-7929acdf-f149-47a1-b3f9-04e1bfa9eecd req-61c7dc0f-1a1c-42d6-a190-0f98dfdfc553 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.287 183755 DEBUG oslo_concurrency.lockutils [req-7929acdf-f149-47a1-b3f9-04e1bfa9eecd req-61c7dc0f-1a1c-42d6-a190-0f98dfdfc553 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.287 183755 DEBUG nova.compute.manager [req-7929acdf-f149-47a1-b3f9-04e1bfa9eecd req-61c7dc0f-1a1c-42d6-a190-0f98dfdfc553 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Processing event network-vif-plugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 27 22:37:54 compute-1 ovn_controller[95969]: 2026-01-27T22:37:54Z|00105|binding|INFO|Releasing lport d5030a57-5c97-4207-a276-f5ed6938ad09 from this chassis (sb_readonly=0)
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.287 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.290 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[fcd4c337-7385-4658-86f7-282a7b283604]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.290 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.291 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.291 105247 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 24d39604-44db-4002-b4d8-7ef0b15b5533 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.291 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.291 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[8481028f-a44c-43c1-8314-b08292ff24ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.291 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.292 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[53d38301-5cf6-4ddd-8277-fa926b51e4d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.292 105247 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: global
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     log         /dev/log local0 debug
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     log-tag     haproxy-metadata-proxy-24d39604-44db-4002-b4d8-7ef0b15b5533
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     user        root
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     group       root
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     maxconn     1024
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     pidfile     /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     daemon
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: defaults
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     log global
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     mode http
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     option httplog
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     option dontlognull
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     option http-server-close
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     option forwardfor
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     retries                 3
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     timeout http-request    30s
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     timeout connect         30s
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     timeout client          32s
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     timeout server          32s
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     timeout http-keep-alive 30s
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: listen listener
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     bind 169.254.169.254:80
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     server metadata /var/lib/neutron/metadata_proxy
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:     http-request add-header X-OVN-Network-ID 24d39604-44db-4002-b4d8-7ef0b15b5533
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 27 22:37:54 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:37:54.293 105247 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'env', 'PROCESS_TAG=haproxy-24d39604-44db-4002-b4d8-7ef0b15b5533', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24d39604-44db-4002-b4d8-7ef0b15b5533.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.299 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.453 183755 DEBUG nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.460 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.464 183755 INFO nova.virt.libvirt.driver [-] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Instance spawned successfully.
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.464 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 27 22:37:54 compute-1 podman[220227]: 2026-01-27 22:37:54.796314749 +0000 UTC m=+0.071468866 container create 294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, tcib_managed=true, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:37:54 compute-1 systemd[1]: Started libpod-conmon-294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527.scope.
Jan 27 22:37:54 compute-1 podman[220227]: 2026-01-27 22:37:54.764598996 +0000 UTC m=+0.039753113 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 22:37:54 compute-1 systemd[1]: Started libcrun container.
Jan 27 22:37:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/753163ed6e1eb822d021fafee8f697dc7c03c7ae566df789ae0b1c0b745cdde2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 22:37:54 compute-1 podman[220227]: 2026-01-27 22:37:54.885568014 +0000 UTC m=+0.160722201 container init 294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Jan 27 22:37:54 compute-1 podman[220227]: 2026-01-27 22:37:54.896291669 +0000 UTC m=+0.171445796 container start 294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260126)
Jan 27 22:37:54 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220240]: [NOTICE]   (220244) : New worker (220246) forked
Jan 27 22:37:54 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220240]: [NOTICE]   (220244) : Loading success.
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.981 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.981 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.982 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.984 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.985 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:37:54 compute-1 nova_compute[183751]: 2026-01-27 22:37:54.985 183755 DEBUG nova.virt.libvirt.driver [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:37:55 compute-1 nova_compute[183751]: 2026-01-27 22:37:55.501 183755 INFO nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Took 8.96 seconds to spawn the instance on the hypervisor.
Jan 27 22:37:55 compute-1 nova_compute[183751]: 2026-01-27 22:37:55.502 183755 DEBUG nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 27 22:37:56 compute-1 nova_compute[183751]: 2026-01-27 22:37:56.049 183755 INFO nova.compute.manager [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Took 14.21 seconds to build instance.
Jan 27 22:37:56 compute-1 nova_compute[183751]: 2026-01-27 22:37:56.364 183755 DEBUG nova.compute.manager [req-0045ac02-2707-475f-a44a-0c75b7376b96 req-e1e39a4d-cf4f-4b86-8b76-73d8b35fa3f2 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Received event network-vif-plugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:37:56 compute-1 nova_compute[183751]: 2026-01-27 22:37:56.364 183755 DEBUG oslo_concurrency.lockutils [req-0045ac02-2707-475f-a44a-0c75b7376b96 req-e1e39a4d-cf4f-4b86-8b76-73d8b35fa3f2 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:37:56 compute-1 nova_compute[183751]: 2026-01-27 22:37:56.365 183755 DEBUG oslo_concurrency.lockutils [req-0045ac02-2707-475f-a44a-0c75b7376b96 req-e1e39a4d-cf4f-4b86-8b76-73d8b35fa3f2 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:37:56 compute-1 nova_compute[183751]: 2026-01-27 22:37:56.365 183755 DEBUG oslo_concurrency.lockutils [req-0045ac02-2707-475f-a44a-0c75b7376b96 req-e1e39a4d-cf4f-4b86-8b76-73d8b35fa3f2 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:56 compute-1 nova_compute[183751]: 2026-01-27 22:37:56.365 183755 DEBUG nova.compute.manager [req-0045ac02-2707-475f-a44a-0c75b7376b96 req-e1e39a4d-cf4f-4b86-8b76-73d8b35fa3f2 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] No waiting events found dispatching network-vif-plugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:37:56 compute-1 nova_compute[183751]: 2026-01-27 22:37:56.365 183755 WARNING nova.compute.manager [req-0045ac02-2707-475f-a44a-0c75b7376b96 req-e1e39a4d-cf4f-4b86-8b76-73d8b35fa3f2 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Received unexpected event network-vif-plugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb for instance with vm_state active and task_state None.
Jan 27 22:37:56 compute-1 nova_compute[183751]: 2026-01-27 22:37:56.366 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:37:56 compute-1 nova_compute[183751]: 2026-01-27 22:37:56.555 183755 DEBUG oslo_concurrency.lockutils [None req-01c74691-be47-436f-bad3-159633be15fe 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.736s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:37:57 compute-1 podman[220255]: 2026-01-27 22:37:57.858009545 +0000 UTC m=+0.158424365 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 27 22:37:58 compute-1 nova_compute[183751]: 2026-01-27 22:37:58.094 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.082 183755 DEBUG oslo_concurrency.lockutils [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "4ace501f-e487-4354-a77c-b4c3dde921eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.083 183755 DEBUG oslo_concurrency.lockutils [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.083 183755 DEBUG oslo_concurrency.lockutils [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.084 183755 DEBUG oslo_concurrency.lockutils [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.084 183755 DEBUG oslo_concurrency.lockutils [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.101 183755 INFO nova.compute.manager [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Terminating instance
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.403 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.623 183755 DEBUG nova.compute.manager [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 27 22:38:01 compute-1 kernel: tapa745dd2e-98 (unregistering): left promiscuous mode
Jan 27 22:38:01 compute-1 NetworkManager[56069]: <info>  [1769553481.6431] device (tapa745dd2e-98): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 27 22:38:01 compute-1 ovn_controller[95969]: 2026-01-27T22:38:01Z|00106|binding|INFO|Releasing lport a745dd2e-98f7-4d01-9419-3b72fc35eecb from this chassis (sb_readonly=0)
Jan 27 22:38:01 compute-1 ovn_controller[95969]: 2026-01-27T22:38:01Z|00107|binding|INFO|Setting lport a745dd2e-98f7-4d01-9419-3b72fc35eecb down in Southbound
Jan 27 22:38:01 compute-1 ovn_controller[95969]: 2026-01-27T22:38:01Z|00108|binding|INFO|Removing iface tapa745dd2e-98 ovn-installed in OVS
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.658 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:01.666 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:35:aa 10.100.0.11'], port_security=['fa:16:3e:2d:35:aa 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4ace501f-e487-4354-a77c-b4c3dde921eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f85db165fdfb4bf4a093051065554230', 'neutron:revision_number': '5', 'neutron:security_group_ids': '95cb9870-b574-46eb-94c8-ffb9aee48a06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f8e41a7-2994-4b63-aa2e-14041916e5f8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=a745dd2e-98f7-4d01-9419-3b72fc35eecb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:38:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:01.668 105247 INFO neutron.agent.ovn.metadata.agent [-] Port a745dd2e-98f7-4d01-9419-3b72fc35eecb in datapath 24d39604-44db-4002-b4d8-7ef0b15b5533 unbound from our chassis
Jan 27 22:38:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:01.669 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24d39604-44db-4002-b4d8-7ef0b15b5533, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:38:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:01.670 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[81161e9a-1d4b-44a0-ab8b-e58b083a3b63]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:01.670 105247 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533 namespace which is not needed anymore
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.687 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:01 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 27 22:38:01 compute-1 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000018.scope: Consumed 7.871s CPU time.
Jan 27 22:38:01 compute-1 systemd-machined[155034]: Machine qemu-8-instance-00000018 terminated.
Jan 27 22:38:01 compute-1 podman[220305]: 2026-01-27 22:38:01.840092504 +0000 UTC m=+0.041472825 container kill 294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0)
Jan 27 22:38:01 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220240]: [NOTICE]   (220244) : haproxy version is 3.0.5-8e879a5
Jan 27 22:38:01 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220240]: [NOTICE]   (220244) : path to executable is /usr/sbin/haproxy
Jan 27 22:38:01 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220240]: [WARNING]  (220244) : Exiting Master process...
Jan 27 22:38:01 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220240]: [ALERT]    (220244) : Current worker (220246) exited with code 143 (Terminated)
Jan 27 22:38:01 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220240]: [WARNING]  (220244) : All workers exited. Exiting... (0)
Jan 27 22:38:01 compute-1 systemd[1]: libpod-294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527.scope: Deactivated successfully.
Jan 27 22:38:01 compute-1 kernel: tapa745dd2e-98: entered promiscuous mode
Jan 27 22:38:01 compute-1 kernel: tapa745dd2e-98 (unregistering): left promiscuous mode
Jan 27 22:38:01 compute-1 ovn_controller[95969]: 2026-01-27T22:38:01Z|00109|binding|INFO|Claiming lport a745dd2e-98f7-4d01-9419-3b72fc35eecb for this chassis.
Jan 27 22:38:01 compute-1 ovn_controller[95969]: 2026-01-27T22:38:01Z|00110|binding|INFO|a745dd2e-98f7-4d01-9419-3b72fc35eecb: Claiming fa:16:3e:2d:35:aa 10.100.0.11
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.856 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.863 183755 DEBUG nova.compute.manager [req-563a0ce2-aa22-41fb-8124-65dfdb60e8b3 req-c57a3ce0-ca26-4d93-b347-45d1b4ba41fb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Received event network-vif-unplugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.863 183755 DEBUG oslo_concurrency.lockutils [req-563a0ce2-aa22-41fb-8124-65dfdb60e8b3 req-c57a3ce0-ca26-4d93-b347-45d1b4ba41fb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.864 183755 DEBUG oslo_concurrency.lockutils [req-563a0ce2-aa22-41fb-8124-65dfdb60e8b3 req-c57a3ce0-ca26-4d93-b347-45d1b4ba41fb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.864 183755 DEBUG oslo_concurrency.lockutils [req-563a0ce2-aa22-41fb-8124-65dfdb60e8b3 req-c57a3ce0-ca26-4d93-b347-45d1b4ba41fb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.864 183755 DEBUG nova.compute.manager [req-563a0ce2-aa22-41fb-8124-65dfdb60e8b3 req-c57a3ce0-ca26-4d93-b347-45d1b4ba41fb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] No waiting events found dispatching network-vif-unplugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.865 183755 DEBUG nova.compute.manager [req-563a0ce2-aa22-41fb-8124-65dfdb60e8b3 req-c57a3ce0-ca26-4d93-b347-45d1b4ba41fb 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Received event network-vif-unplugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:38:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:01.868 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:35:aa 10.100.0.11'], port_security=['fa:16:3e:2d:35:aa 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4ace501f-e487-4354-a77c-b4c3dde921eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f85db165fdfb4bf4a093051065554230', 'neutron:revision_number': '5', 'neutron:security_group_ids': '95cb9870-b574-46eb-94c8-ffb9aee48a06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f8e41a7-2994-4b63-aa2e-14041916e5f8, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=a745dd2e-98f7-4d01-9419-3b72fc35eecb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.882 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:01 compute-1 ovn_controller[95969]: 2026-01-27T22:38:01Z|00111|binding|INFO|Releasing lport a745dd2e-98f7-4d01-9419-3b72fc35eecb from this chassis (sb_readonly=0)
Jan 27 22:38:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:01.902 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:35:aa 10.100.0.11'], port_security=['fa:16:3e:2d:35:aa 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4ace501f-e487-4354-a77c-b4c3dde921eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f85db165fdfb4bf4a093051065554230', 'neutron:revision_number': '5', 'neutron:security_group_ids': '95cb9870-b574-46eb-94c8-ffb9aee48a06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f8e41a7-2994-4b63-aa2e-14041916e5f8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=a745dd2e-98f7-4d01-9419-3b72fc35eecb) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:38:01 compute-1 podman[220320]: 2026-01-27 22:38:01.903148632 +0000 UTC m=+0.040749278 container died 294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126)
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.907 183755 INFO nova.virt.libvirt.driver [-] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Instance destroyed successfully.
Jan 27 22:38:01 compute-1 nova_compute[183751]: 2026-01-27 22:38:01.908 183755 DEBUG nova.objects.instance [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lazy-loading 'resources' on Instance uuid 4ace501f-e487-4354-a77c-b4c3dde921eb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:38:01 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527-userdata-shm.mount: Deactivated successfully.
Jan 27 22:38:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-753163ed6e1eb822d021fafee8f697dc7c03c7ae566df789ae0b1c0b745cdde2-merged.mount: Deactivated successfully.
Jan 27 22:38:01 compute-1 podman[220320]: 2026-01-27 22:38:01.970819774 +0000 UTC m=+0.108420400 container cleanup 294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Jan 27 22:38:01 compute-1 systemd[1]: libpod-conmon-294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527.scope: Deactivated successfully.
Jan 27 22:38:01 compute-1 podman[220321]: 2026-01-27 22:38:01.985765513 +0000 UTC m=+0.106162524 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Jan 27 22:38:01 compute-1 podman[220326]: 2026-01-27 22:38:01.991610857 +0000 UTC m=+0.111891605 container remove 294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Jan 27 22:38:01 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:01.998 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[75cc918e-3658-489e-ab36-b846a170cc61]: (4, ("Tue Jan 27 10:38:01 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533 (294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527)\n294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527\nTue Jan 27 10:38:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533 (294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527)\n294244434c39cb5fbc347ba68ae5beea0734c6050ae5117b82d209578981c527\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.000 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5a5034d8-f434-46ca-8e1f-5a2744ab4f5c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.000 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.001 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5c2d8713-0d0b-4e29-9952-048823ab4fac]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.002 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24d39604-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.003 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:02 compute-1 kernel: tap24d39604-40: left promiscuous mode
Jan 27 22:38:02 compute-1 podman[220331]: 2026-01-27 22:38:02.00712158 +0000 UTC m=+0.124088146 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.026 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.032 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c2436887-7571-4e73-85d8-3f84913ec543]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.054 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[21b84976-e1a7-468f-924a-08e4ef104ad8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.055 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[02c43a79-cd29-44b7-941b-1620234e59fa]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.072 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5f429687-7e8b-4722-be22-691340999d8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 961622, 'reachable_time': 18548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220404, 'error': None, 'target': 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.077 105687 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.077 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[e8502033-2ac4-46ca-86bf-fef2e04c373a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.078 105247 INFO neutron.agent.ovn.metadata.agent [-] Port a745dd2e-98f7-4d01-9419-3b72fc35eecb in datapath 24d39604-44db-4002-b4d8-7ef0b15b5533 unbound from our chassis
Jan 27 22:38:02 compute-1 systemd[1]: run-netns-ovnmeta\x2d24d39604\x2d44db\x2d4002\x2db4d8\x2d7ef0b15b5533.mount: Deactivated successfully.
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.078 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24d39604-44db-4002-b4d8-7ef0b15b5533, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.079 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ac78e3-afa3-41ba-a9aa-7fde014e0eab]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.080 105247 INFO neutron.agent.ovn.metadata.agent [-] Port a745dd2e-98f7-4d01-9419-3b72fc35eecb in datapath 24d39604-44db-4002-b4d8-7ef0b15b5533 unbound from our chassis
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.081 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24d39604-44db-4002-b4d8-7ef0b15b5533, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:38:02 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:02.081 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[f36e484c-6929-4948-8db9-994bb84b36ea]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.414 183755 DEBUG nova.virt.libvirt.vif [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-27T22:37:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-589587884',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-589587884',id=24,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-27T22:37:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f85db165fdfb4bf4a093051065554230',ramdisk_id='',reservation_id='r-oyo4o1n9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1358952714',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1358952714-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-27T22:37:55Z,user_data=None,user_id='84404785aedd471590f8ac69cbbb69db',uuid=4ace501f-e487-4354-a77c-b4c3dde921eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "address": "fa:16:3e:2d:35:aa", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745dd2e-98", "ovs_interfaceid": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.415 183755 DEBUG nova.network.os_vif_util [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converting VIF {"id": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "address": "fa:16:3e:2d:35:aa", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa745dd2e-98", "ovs_interfaceid": "a745dd2e-98f7-4d01-9419-3b72fc35eecb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.416 183755 DEBUG nova.network.os_vif_util [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:35:aa,bridge_name='br-int',has_traffic_filtering=True,id=a745dd2e-98f7-4d01-9419-3b72fc35eecb,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745dd2e-98') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.416 183755 DEBUG os_vif [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:35:aa,bridge_name='br-int',has_traffic_filtering=True,id=a745dd2e-98f7-4d01-9419-3b72fc35eecb,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745dd2e-98') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.418 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.418 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa745dd2e-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.455 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.457 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.458 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.458 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=57bb519a-187b-4718-87da-dd174cfea392) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.459 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.460 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.468 183755 INFO os_vif [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:35:aa,bridge_name='br-int',has_traffic_filtering=True,id=a745dd2e-98f7-4d01-9419-3b72fc35eecb,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa745dd2e-98')
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.469 183755 INFO nova.virt.libvirt.driver [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Deleting instance files /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb_del
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.470 183755 INFO nova.virt.libvirt.driver [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Deletion of /var/lib/nova/instances/4ace501f-e487-4354-a77c-b4c3dde921eb_del complete
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.984 183755 INFO nova.compute.manager [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Took 1.36 seconds to destroy the instance on the hypervisor.
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.984 183755 DEBUG oslo.service.backend._eventlet.loopingcall [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.984 183755 DEBUG nova.compute.manager [-] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.985 183755 DEBUG nova.network.neutron [-] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 27 22:38:02 compute-1 nova_compute[183751]: 2026-01-27 22:38:02.985 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.093 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.151 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.594 183755 DEBUG nova.compute.manager [req-e2cc818b-99ea-488c-b98c-9eb0e5034a4f req-5c4dfdd9-6b6b-4b06-8aff-8339f43b819d 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Received event network-vif-deleted-a745dd2e-98f7-4d01-9419-3b72fc35eecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.595 183755 INFO nova.compute.manager [req-e2cc818b-99ea-488c-b98c-9eb0e5034a4f req-5c4dfdd9-6b6b-4b06-8aff-8339f43b819d 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Neutron deleted interface a745dd2e-98f7-4d01-9419-3b72fc35eecb; detaching it from the instance and deleting it from the info cache
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.595 183755 DEBUG nova.network.neutron [req-e2cc818b-99ea-488c-b98c-9eb0e5034a4f req-5c4dfdd9-6b6b-4b06-8aff-8339f43b819d 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.982 183755 DEBUG nova.compute.manager [req-6779b521-2adf-45e4-bb42-d130e853466f req-dbebbd03-6411-4020-bfbc-d65307132685 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Received event network-vif-unplugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.984 183755 DEBUG oslo_concurrency.lockutils [req-6779b521-2adf-45e4-bb42-d130e853466f req-dbebbd03-6411-4020-bfbc-d65307132685 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.985 183755 DEBUG oslo_concurrency.lockutils [req-6779b521-2adf-45e4-bb42-d130e853466f req-dbebbd03-6411-4020-bfbc-d65307132685 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.986 183755 DEBUG oslo_concurrency.lockutils [req-6779b521-2adf-45e4-bb42-d130e853466f req-dbebbd03-6411-4020-bfbc-d65307132685 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.986 183755 DEBUG nova.compute.manager [req-6779b521-2adf-45e4-bb42-d130e853466f req-dbebbd03-6411-4020-bfbc-d65307132685 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] No waiting events found dispatching network-vif-unplugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:38:03 compute-1 nova_compute[183751]: 2026-01-27 22:38:03.986 183755 DEBUG nova.compute.manager [req-6779b521-2adf-45e4-bb42-d130e853466f req-dbebbd03-6411-4020-bfbc-d65307132685 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Received event network-vif-unplugged-a745dd2e-98f7-4d01-9419-3b72fc35eecb for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:38:04 compute-1 nova_compute[183751]: 2026-01-27 22:38:04.030 183755 DEBUG nova.network.neutron [-] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:38:04 compute-1 nova_compute[183751]: 2026-01-27 22:38:04.104 183755 DEBUG nova.compute.manager [req-e2cc818b-99ea-488c-b98c-9eb0e5034a4f req-5c4dfdd9-6b6b-4b06-8aff-8339f43b819d 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Detach interface failed, port_id=a745dd2e-98f7-4d01-9419-3b72fc35eecb, reason: Instance 4ace501f-e487-4354-a77c-b4c3dde921eb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 27 22:38:04 compute-1 nova_compute[183751]: 2026-01-27 22:38:04.540 183755 INFO nova.compute.manager [-] [instance: 4ace501f-e487-4354-a77c-b4c3dde921eb] Took 1.56 seconds to deallocate network for instance.
Jan 27 22:38:05 compute-1 nova_compute[183751]: 2026-01-27 22:38:05.059 183755 DEBUG oslo_concurrency.lockutils [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:05 compute-1 nova_compute[183751]: 2026-01-27 22:38:05.059 183755 DEBUG oslo_concurrency.lockutils [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:05 compute-1 nova_compute[183751]: 2026-01-27 22:38:05.156 183755 DEBUG nova.compute.provider_tree [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:38:05 compute-1 podman[193064]: time="2026-01-27T22:38:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:38:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:38:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:38:05 compute-1 nova_compute[183751]: 2026-01-27 22:38:05.665 183755 DEBUG nova.scheduler.client.report [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:38:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:38:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 27 22:38:06 compute-1 nova_compute[183751]: 2026-01-27 22:38:06.178 183755 DEBUG oslo_concurrency.lockutils [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.119s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:06 compute-1 nova_compute[183751]: 2026-01-27 22:38:06.223 183755 INFO nova.scheduler.client.report [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Deleted allocations for instance 4ace501f-e487-4354-a77c-b4c3dde921eb
Jan 27 22:38:07 compute-1 nova_compute[183751]: 2026-01-27 22:38:07.260 183755 DEBUG oslo_concurrency.lockutils [None req-4bd3981d-644c-4a28-a992-523bb782f219 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "4ace501f-e487-4354-a77c-b4c3dde921eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.177s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:07 compute-1 nova_compute[183751]: 2026-01-27 22:38:07.493 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:07 compute-1 nova_compute[183751]: 2026-01-27 22:38:07.659 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.096 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.848 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.849 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.868 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.868 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5825MB free_disk=73.13684844970703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.868 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:08 compute-1 nova_compute[183751]: 2026-01-27 22:38:08.869 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:09 compute-1 nova_compute[183751]: 2026-01-27 22:38:09.913 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:38:09 compute-1 nova_compute[183751]: 2026-01-27 22:38:09.913 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:38:08 up  2:40,  0 user,  load average: 0.22, 0.14, 0.13\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:38:09 compute-1 nova_compute[183751]: 2026-01-27 22:38:09.946 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:38:10 compute-1 nova_compute[183751]: 2026-01-27 22:38:10.455 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:38:10 compute-1 nova_compute[183751]: 2026-01-27 22:38:10.969 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:38:10 compute-1 nova_compute[183751]: 2026-01-27 22:38:10.970 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:11.295 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:11.296 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:11.297 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:11 compute-1 podman[220409]: 2026-01-27 22:38:11.76967328 +0000 UTC m=+0.071391964 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:38:11 compute-1 nova_compute[183751]: 2026-01-27 22:38:11.966 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:38:11 compute-1 nova_compute[183751]: 2026-01-27 22:38:11.967 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:38:11 compute-1 nova_compute[183751]: 2026-01-27 22:38:11.967 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:38:12 compute-1 nova_compute[183751]: 2026-01-27 22:38:12.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:38:12 compute-1 nova_compute[183751]: 2026-01-27 22:38:12.496 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:13 compute-1 nova_compute[183751]: 2026-01-27 22:38:13.098 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:13 compute-1 nova_compute[183751]: 2026-01-27 22:38:13.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:38:13 compute-1 sshd-session[220433]: Invalid user sol from 80.94.92.186 port 38162
Jan 27 22:38:14 compute-1 sshd-session[220433]: Connection closed by invalid user sol 80.94.92.186 port 38162 [preauth]
Jan 27 22:38:16 compute-1 nova_compute[183751]: 2026-01-27 22:38:16.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:38:17 compute-1 nova_compute[183751]: 2026-01-27 22:38:17.532 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:18 compute-1 nova_compute[183751]: 2026-01-27 22:38:18.101 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:19 compute-1 openstack_network_exporter[195945]: ERROR   22:38:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:38:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:38:19 compute-1 openstack_network_exporter[195945]: ERROR   22:38:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:38:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:38:22 compute-1 nova_compute[183751]: 2026-01-27 22:38:22.534 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:23 compute-1 nova_compute[183751]: 2026-01-27 22:38:23.103 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:27 compute-1 nova_compute[183751]: 2026-01-27 22:38:27.563 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:28 compute-1 nova_compute[183751]: 2026-01-27 22:38:28.106 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:28 compute-1 podman[220435]: 2026-01-27 22:38:28.818908767 +0000 UTC m=+0.119440511 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_managed=true)
Jan 27 22:38:32 compute-1 nova_compute[183751]: 2026-01-27 22:38:32.565 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:32 compute-1 podman[220462]: 2026-01-27 22:38:32.765976043 +0000 UTC m=+0.075589138 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 27 22:38:32 compute-1 podman[220461]: 2026-01-27 22:38:32.777744343 +0000 UTC m=+0.085222226 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Jan 27 22:38:33 compute-1 nova_compute[183751]: 2026-01-27 22:38:33.109 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:35 compute-1 podman[193064]: time="2026-01-27T22:38:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:38:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:38:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:38:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:38:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Jan 27 22:38:37 compute-1 nova_compute[183751]: 2026-01-27 22:38:37.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:38:37 compute-1 nova_compute[183751]: 2026-01-27 22:38:37.567 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:38 compute-1 nova_compute[183751]: 2026-01-27 22:38:38.110 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:42 compute-1 nova_compute[183751]: 2026-01-27 22:38:42.608 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:42 compute-1 podman[220499]: 2026-01-27 22:38:42.743950444 +0000 UTC m=+0.060271039 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:38:43 compute-1 nova_compute[183751]: 2026-01-27 22:38:43.114 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:44 compute-1 nova_compute[183751]: 2026-01-27 22:38:44.070 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "aa7a18b8-946b-4b47-abf0-c146699d2c11" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:44 compute-1 nova_compute[183751]: 2026-01-27 22:38:44.070 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:44 compute-1 nova_compute[183751]: 2026-01-27 22:38:44.579 183755 DEBUG nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 27 22:38:45 compute-1 nova_compute[183751]: 2026-01-27 22:38:45.140 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:45 compute-1 nova_compute[183751]: 2026-01-27 22:38:45.140 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:45 compute-1 nova_compute[183751]: 2026-01-27 22:38:45.154 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 27 22:38:45 compute-1 nova_compute[183751]: 2026-01-27 22:38:45.155 183755 INFO nova.compute.claims [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Claim successful on node compute-1.ctlplane.example.com
Jan 27 22:38:46 compute-1 nova_compute[183751]: 2026-01-27 22:38:46.224 183755 DEBUG nova.compute.provider_tree [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:38:46 compute-1 nova_compute[183751]: 2026-01-27 22:38:46.732 183755 DEBUG nova.scheduler.client.report [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:38:47 compute-1 nova_compute[183751]: 2026-01-27 22:38:47.251 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:47 compute-1 nova_compute[183751]: 2026-01-27 22:38:47.252 183755 DEBUG nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 27 22:38:47 compute-1 nova_compute[183751]: 2026-01-27 22:38:47.669 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:47 compute-1 nova_compute[183751]: 2026-01-27 22:38:47.767 183755 DEBUG nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 27 22:38:47 compute-1 nova_compute[183751]: 2026-01-27 22:38:47.767 183755 DEBUG nova.network.neutron [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 27 22:38:47 compute-1 nova_compute[183751]: 2026-01-27 22:38:47.768 183755 WARNING neutronclient.v2_0.client [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:38:47 compute-1 nova_compute[183751]: 2026-01-27 22:38:47.768 183755 WARNING neutronclient.v2_0.client [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:38:48 compute-1 nova_compute[183751]: 2026-01-27 22:38:48.116 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:48 compute-1 nova_compute[183751]: 2026-01-27 22:38:48.277 183755 INFO nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 27 22:38:48 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:48.541 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:38:48 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:48.541 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:38:48 compute-1 nova_compute[183751]: 2026-01-27 22:38:48.542 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:48 compute-1 nova_compute[183751]: 2026-01-27 22:38:48.699 183755 DEBUG nova.network.neutron [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Successfully created port: c5f96afb-6844-4f0e-8162-c226b35de01d _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 27 22:38:48 compute-1 nova_compute[183751]: 2026-01-27 22:38:48.786 183755 DEBUG nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 27 22:38:49 compute-1 openstack_network_exporter[195945]: ERROR   22:38:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:38:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:38:49 compute-1 openstack_network_exporter[195945]: ERROR   22:38:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:38:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.806 183755 DEBUG nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.808 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.808 183755 INFO nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Creating image(s)
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.808 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "/var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.809 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "/var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.809 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "/var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.810 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.812 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.814 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.901 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.902 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.902 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.903 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.906 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.907 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.973 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:38:49 compute-1 nova_compute[183751]: 2026-01-27 22:38:49.974 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.008 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.009 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.010 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.081 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.081 183755 DEBUG nova.virt.disk.api [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Checking if we can resize image /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.082 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.135 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.136 183755 DEBUG nova.virt.disk.api [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Cannot resize image /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.136 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.137 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Ensure instance console log exists: /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.137 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.137 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.138 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.597 183755 DEBUG nova.network.neutron [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Successfully updated port: c5f96afb-6844-4f0e-8162-c226b35de01d _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.663 183755 DEBUG nova.compute.manager [req-1845c3c7-fa9c-4dd1-8cd7-be1a8b7d3e8c req-4780d0b2-413c-430a-878d-d71c48c66870 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Received event network-changed-c5f96afb-6844-4f0e-8162-c226b35de01d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.663 183755 DEBUG nova.compute.manager [req-1845c3c7-fa9c-4dd1-8cd7-be1a8b7d3e8c req-4780d0b2-413c-430a-878d-d71c48c66870 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Refreshing instance network info cache due to event network-changed-c5f96afb-6844-4f0e-8162-c226b35de01d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.663 183755 DEBUG oslo_concurrency.lockutils [req-1845c3c7-fa9c-4dd1-8cd7-be1a8b7d3e8c req-4780d0b2-413c-430a-878d-d71c48c66870 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "refresh_cache-aa7a18b8-946b-4b47-abf0-c146699d2c11" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.663 183755 DEBUG oslo_concurrency.lockutils [req-1845c3c7-fa9c-4dd1-8cd7-be1a8b7d3e8c req-4780d0b2-413c-430a-878d-d71c48c66870 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquired lock "refresh_cache-aa7a18b8-946b-4b47-abf0-c146699d2c11" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:38:50 compute-1 nova_compute[183751]: 2026-01-27 22:38:50.664 183755 DEBUG nova.network.neutron [req-1845c3c7-fa9c-4dd1-8cd7-be1a8b7d3e8c req-4780d0b2-413c-430a-878d-d71c48c66870 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Refreshing network info cache for port c5f96afb-6844-4f0e-8162-c226b35de01d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 27 22:38:51 compute-1 nova_compute[183751]: 2026-01-27 22:38:51.106 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "refresh_cache-aa7a18b8-946b-4b47-abf0-c146699d2c11" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:38:51 compute-1 nova_compute[183751]: 2026-01-27 22:38:51.169 183755 WARNING neutronclient.v2_0.client [req-1845c3c7-fa9c-4dd1-8cd7-be1a8b7d3e8c req-4780d0b2-413c-430a-878d-d71c48c66870 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:38:51 compute-1 nova_compute[183751]: 2026-01-27 22:38:51.339 183755 DEBUG nova.network.neutron [req-1845c3c7-fa9c-4dd1-8cd7-be1a8b7d3e8c req-4780d0b2-413c-430a-878d-d71c48c66870 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:38:51 compute-1 nova_compute[183751]: 2026-01-27 22:38:51.476 183755 DEBUG nova.network.neutron [req-1845c3c7-fa9c-4dd1-8cd7-be1a8b7d3e8c req-4780d0b2-413c-430a-878d-d71c48c66870 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:38:51 compute-1 nova_compute[183751]: 2026-01-27 22:38:51.983 183755 DEBUG oslo_concurrency.lockutils [req-1845c3c7-fa9c-4dd1-8cd7-be1a8b7d3e8c req-4780d0b2-413c-430a-878d-d71c48c66870 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Releasing lock "refresh_cache-aa7a18b8-946b-4b47-abf0-c146699d2c11" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:38:51 compute-1 nova_compute[183751]: 2026-01-27 22:38:51.985 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquired lock "refresh_cache-aa7a18b8-946b-4b47-abf0-c146699d2c11" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:38:51 compute-1 nova_compute[183751]: 2026-01-27 22:38:51.985 183755 DEBUG nova.network.neutron [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 27 22:38:52 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:52.543 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:52 compute-1 nova_compute[183751]: 2026-01-27 22:38:52.707 183755 DEBUG nova.network.neutron [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:38:52 compute-1 nova_compute[183751]: 2026-01-27 22:38:52.709 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:52 compute-1 nova_compute[183751]: 2026-01-27 22:38:52.942 183755 WARNING neutronclient.v2_0.client [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.118 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.140 183755 DEBUG nova.network.neutron [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Updating instance_info_cache with network_info: [{"id": "c5f96afb-6844-4f0e-8162-c226b35de01d", "address": "fa:16:3e:f3:43:42", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f96afb-68", "ovs_interfaceid": "c5f96afb-6844-4f0e-8162-c226b35de01d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.648 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Releasing lock "refresh_cache-aa7a18b8-946b-4b47-abf0-c146699d2c11" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.649 183755 DEBUG nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Instance network_info: |[{"id": "c5f96afb-6844-4f0e-8162-c226b35de01d", "address": "fa:16:3e:f3:43:42", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f96afb-68", "ovs_interfaceid": "c5f96afb-6844-4f0e-8162-c226b35de01d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.652 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Start _get_guest_xml network_info=[{"id": "c5f96afb-6844-4f0e-8162-c226b35de01d", "address": "fa:16:3e:f3:43:42", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f96afb-68", "ovs_interfaceid": "c5f96afb-6844-4f0e-8162-c226b35de01d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '46eb297a-0b7d-41f9-8336-a7ae35b5797e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.656 183755 WARNING nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.658 183755 DEBUG nova.virt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-1191134516', uuid='aa7a18b8-946b-4b47-abf0-c146699d2c11'), owner=OwnerMeta(userid='84404785aedd471590f8ac69cbbb69db', username='tempest-TestExecuteZoneMigrationStrategy-1358952714-project-admin', projectid='f85db165fdfb4bf4a093051065554230', projectname='tempest-TestExecuteZoneMigrationStrategy-1358952714'), image=ImageMeta(id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "c5f96afb-6844-4f0e-8162-c226b35de01d", "address": "fa:16:3e:f3:43:42", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f96afb-68", "ovs_interfaceid": 
"c5f96afb-6844-4f0e-8162-c226b35de01d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769553533.6582015) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.666 183755 DEBUG nova.virt.libvirt.host [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.666 183755 DEBUG nova.virt.libvirt.host [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.672 183755 DEBUG nova.virt.libvirt.host [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.672 183755 DEBUG nova.virt.libvirt.host [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.674 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.674 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-27T22:07:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.674 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.675 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.675 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.675 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.675 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.676 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.676 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.676 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.677 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.677 183755 DEBUG nova.virt.hardware [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.682 183755 DEBUG nova.virt.libvirt.vif [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:38:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1191134516',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1191134516',id=26,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f85db165fdfb4bf4a093051065554230',ramdisk_id='',reservation_id='r-qtc2emtx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1358952714',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1358952714-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:38:48Z,user_data=None,user_id='84404785aedd471590f8ac69cbbb69db',uuid=aa7a18b8-946b-4b47-abf0-c146699d2c11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5f96afb-6844-4f0e-8162-c226b35de01d", "address": "fa:16:3e:f3:43:42", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f96afb-68", "ovs_interfaceid": "c5f96afb-6844-4f0e-8162-c226b35de01d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.682 183755 DEBUG nova.network.os_vif_util [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converting VIF {"id": "c5f96afb-6844-4f0e-8162-c226b35de01d", "address": "fa:16:3e:f3:43:42", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f96afb-68", "ovs_interfaceid": "c5f96afb-6844-4f0e-8162-c226b35de01d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.683 183755 DEBUG nova.network.os_vif_util [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:43:42,bridge_name='br-int',has_traffic_filtering=True,id=c5f96afb-6844-4f0e-8162-c226b35de01d,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f96afb-68') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:38:53 compute-1 nova_compute[183751]: 2026-01-27 22:38:53.684 183755 DEBUG nova.objects.instance [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lazy-loading 'pci_devices' on Instance uuid aa7a18b8-946b-4b47-abf0-c146699d2c11 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.194 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] End _get_guest_xml xml=<domain type="kvm">
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <uuid>aa7a18b8-946b-4b47-abf0-c146699d2c11</uuid>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <name>instance-0000001a</name>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <memory>131072</memory>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <vcpu>1</vcpu>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <metadata>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1191134516</nova:name>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <nova:creationTime>2026-01-27 22:38:53</nova:creationTime>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <nova:flavor name="m1.nano" id="4ed8ebce-d846-4d13-950a-4fb8c3a02942">
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:memory>128</nova:memory>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:disk>1</nova:disk>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:swap>0</nova:swap>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:ephemeral>0</nova:ephemeral>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:vcpus>1</nova:vcpus>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:extraSpecs>
Jan 27 22:38:54 compute-1 nova_compute[183751]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         </nova:extraSpecs>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       </nova:flavor>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <nova:image uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e">
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:minDisk>1</nova:minDisk>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:minRam>0</nova:minRam>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:properties>
Jan 27 22:38:54 compute-1 nova_compute[183751]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         </nova:properties>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       </nova:image>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <nova:owner>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:user uuid="84404785aedd471590f8ac69cbbb69db">tempest-TestExecuteZoneMigrationStrategy-1358952714-project-admin</nova:user>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:project uuid="f85db165fdfb4bf4a093051065554230">tempest-TestExecuteZoneMigrationStrategy-1358952714</nova:project>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       </nova:owner>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <nova:root type="image" uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <nova:ports>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         <nova:port uuid="c5f96afb-6844-4f0e-8162-c226b35de01d">
Jan 27 22:38:54 compute-1 nova_compute[183751]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:         </nova:port>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       </nova:ports>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     </nova:instance>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   </metadata>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <sysinfo type="smbios">
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <system>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <entry name="manufacturer">RDO</entry>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <entry name="product">OpenStack Compute</entry>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <entry name="serial">aa7a18b8-946b-4b47-abf0-c146699d2c11</entry>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <entry name="uuid">aa7a18b8-946b-4b47-abf0-c146699d2c11</entry>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <entry name="family">Virtual Machine</entry>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     </system>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   </sysinfo>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <os>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <boot dev="hd"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <smbios mode="sysinfo"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   </os>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <features>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <acpi/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <apic/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <vmcoreinfo/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   </features>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <clock offset="utc">
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <timer name="pit" tickpolicy="delay"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <timer name="hpet" present="no"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   </clock>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <cpu mode="custom" match="exact">
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <model>Nehalem</model>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <topology sockets="1" cores="1" threads="1"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   </cpu>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   <devices>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <disk type="file" device="disk">
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <target dev="vda" bus="virtio"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <disk type="file" device="cdrom">
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <driver name="qemu" type="raw" cache="none"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk.config"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <target dev="sda" bus="sata"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <interface type="ethernet">
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <mac address="fa:16:3e:f3:43:42"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <driver name="vhost" rx_queue_size="512"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <mtu size="1442"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <target dev="tapc5f96afb-68"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     </interface>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <serial type="pty">
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <log file="/var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/console.log" append="off"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     </serial>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <video>
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     </video>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <input type="tablet" bus="usb"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <rng model="virtio">
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <backend model="random">/dev/urandom</backend>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     </rng>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <controller type="usb" index="0"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 27 22:38:54 compute-1 nova_compute[183751]:       <stats period="10"/>
Jan 27 22:38:54 compute-1 nova_compute[183751]:     </memballoon>
Jan 27 22:38:54 compute-1 nova_compute[183751]:   </devices>
Jan 27 22:38:54 compute-1 nova_compute[183751]: </domain>
Jan 27 22:38:54 compute-1 nova_compute[183751]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.196 183755 DEBUG nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Preparing to wait for external event network-vif-plugged-c5f96afb-6844-4f0e-8162-c226b35de01d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.196 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.196 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.196 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.197 183755 DEBUG nova.virt.libvirt.vif [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:38:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1191134516',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1191134516',id=26,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f85db165fdfb4bf4a093051065554230',ramdisk_id='',reservation_id='r-qtc2emtx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1358952714',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1358952714-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:38:48Z,user_data=None,user_id='84404785aedd471590f8ac69cbbb69db',uuid=aa7a18b8-946b-4b47-abf0-c146699d2c11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5f96afb-6844-4f0e-8162-c226b35de01d", "address": "fa:16:3e:f3:43:42", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f96afb-68", "ovs_interfaceid": "c5f96afb-6844-4f0e-8162-c226b35de01d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.198 183755 DEBUG nova.network.os_vif_util [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converting VIF {"id": "c5f96afb-6844-4f0e-8162-c226b35de01d", "address": "fa:16:3e:f3:43:42", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f96afb-68", "ovs_interfaceid": "c5f96afb-6844-4f0e-8162-c226b35de01d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.198 183755 DEBUG nova.network.os_vif_util [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:43:42,bridge_name='br-int',has_traffic_filtering=True,id=c5f96afb-6844-4f0e-8162-c226b35de01d,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f96afb-68') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.199 183755 DEBUG os_vif [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:43:42,bridge_name='br-int',has_traffic_filtering=True,id=c5f96afb-6844-4f0e-8162-c226b35de01d,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f96afb-68') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.199 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.200 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.200 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.201 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.201 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '556aa069-cbe6-5786-a14b-7cd2a0822afc', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.203 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.204 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.205 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.209 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.210 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5f96afb-68, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.210 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc5f96afb-68, col_values=(('qos', UUID('dbfe6175-1a55-46a5-97ca-33a59f05e75e')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.211 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc5f96afb-68, col_values=(('external_ids', {'iface-id': 'c5f96afb-6844-4f0e-8162-c226b35de01d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:43:42', 'vm-uuid': 'aa7a18b8-946b-4b47-abf0-c146699d2c11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.212 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:54 compute-1 NetworkManager[56069]: <info>  [1769553534.2146] manager: (tapc5f96afb-68): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.215 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.220 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:54 compute-1 nova_compute[183751]: 2026-01-27 22:38:54.221 183755 INFO os_vif [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:43:42,bridge_name='br-int',has_traffic_filtering=True,id=c5f96afb-6844-4f0e-8162-c226b35de01d,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f96afb-68')
Jan 27 22:38:55 compute-1 nova_compute[183751]: 2026-01-27 22:38:55.785 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:38:55 compute-1 nova_compute[183751]: 2026-01-27 22:38:55.786 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:38:55 compute-1 nova_compute[183751]: 2026-01-27 22:38:55.787 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] No VIF found with MAC fa:16:3e:f3:43:42, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:38:55 compute-1 nova_compute[183751]: 2026-01-27 22:38:55.787 183755 INFO nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Using config drive
Jan 27 22:38:56 compute-1 nova_compute[183751]: 2026-01-27 22:38:56.298 183755 WARNING neutronclient.v2_0.client [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:38:56 compute-1 nova_compute[183751]: 2026-01-27 22:38:56.487 183755 INFO nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Creating config drive at /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk.config
Jan 27 22:38:56 compute-1 nova_compute[183751]: 2026-01-27 22:38:56.493 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpu_dd5vmu execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:38:56 compute-1 nova_compute[183751]: 2026-01-27 22:38:56.629 183755 DEBUG oslo_concurrency.processutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpu_dd5vmu" returned: 0 in 0.136s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:38:56 compute-1 kernel: tapc5f96afb-68: entered promiscuous mode
Jan 27 22:38:56 compute-1 NetworkManager[56069]: <info>  [1769553536.7410] manager: (tapc5f96afb-68): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Jan 27 22:38:56 compute-1 nova_compute[183751]: 2026-01-27 22:38:56.740 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:56 compute-1 ovn_controller[95969]: 2026-01-27T22:38:56Z|00112|binding|INFO|Claiming lport c5f96afb-6844-4f0e-8162-c226b35de01d for this chassis.
Jan 27 22:38:56 compute-1 ovn_controller[95969]: 2026-01-27T22:38:56Z|00113|binding|INFO|c5f96afb-6844-4f0e-8162-c226b35de01d: Claiming fa:16:3e:f3:43:42 10.100.0.13
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.750 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:43:42 10.100.0.13'], port_security=['fa:16:3e:f3:43:42 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'aa7a18b8-946b-4b47-abf0-c146699d2c11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f85db165fdfb4bf4a093051065554230', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95cb9870-b574-46eb-94c8-ffb9aee48a06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f8e41a7-2994-4b63-aa2e-14041916e5f8, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=c5f96afb-6844-4f0e-8162-c226b35de01d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.751 105247 INFO neutron.agent.ovn.metadata.agent [-] Port c5f96afb-6844-4f0e-8162-c226b35de01d in datapath 24d39604-44db-4002-b4d8-7ef0b15b5533 bound to our chassis
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.752 105247 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24d39604-44db-4002-b4d8-7ef0b15b5533
Jan 27 22:38:56 compute-1 ovn_controller[95969]: 2026-01-27T22:38:56Z|00114|binding|INFO|Setting lport c5f96afb-6844-4f0e-8162-c226b35de01d ovn-installed in OVS
Jan 27 22:38:56 compute-1 ovn_controller[95969]: 2026-01-27T22:38:56Z|00115|binding|INFO|Setting lport c5f96afb-6844-4f0e-8162-c226b35de01d up in Southbound
Jan 27 22:38:56 compute-1 nova_compute[183751]: 2026-01-27 22:38:56.758 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:56 compute-1 nova_compute[183751]: 2026-01-27 22:38:56.758 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:56 compute-1 nova_compute[183751]: 2026-01-27 22:38:56.763 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:56 compute-1 nova_compute[183751]: 2026-01-27 22:38:56.767 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.772 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[d70919b5-3c28-4a72-9dec-7023eefc29f8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.773 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24d39604-41 in ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.775 212869 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24d39604-40 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.776 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[a387732a-2dc2-4580-932b-eac350260fc5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.777 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[64b89061-8c03-42d9-8ab5-e4efb6ec9d8d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.791 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[b1b56f77-af8a-4300-8130-e28c8d549610]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:56 compute-1 systemd-udevd[220562]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:38:56 compute-1 systemd-machined[155034]: New machine qemu-9-instance-0000001a.
Jan 27 22:38:56 compute-1 NetworkManager[56069]: <info>  [1769553536.8109] device (tapc5f96afb-68): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 22:38:56 compute-1 NetworkManager[56069]: <info>  [1769553536.8120] device (tapc5f96afb-68): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.811 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[6f62250e-4cf7-47ad-930f-b69c808c78e4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:56 compute-1 systemd[1]: Started Virtual Machine qemu-9-instance-0000001a.
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.856 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[db97c1e7-4a95-4132-aada-47bd300f2878]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.861 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7f76e2f0-b0a1-4722-ad21-2487d52d80c1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:56 compute-1 NetworkManager[56069]: <info>  [1769553536.8627] manager: (tap24d39604-40): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.912 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[22292be0-1eee-4e14-9833-7a41d16f1036]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.916 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[36c0360b-a729-41e0-a78a-16c4f5e5b6c6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:56 compute-1 NetworkManager[56069]: <info>  [1769553536.9555] device (tap24d39604-40): carrier: link connected
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.965 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a76ec2-1e98-4fe7-bda2-a05685ef6c74]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:56 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:56.992 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[f9dd5617-4fc6-4a06-afdc-6135f3400506]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24d39604-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:91:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 967922, 'reachable_time': 18548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220593, 'error': None, 'target': 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.017 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[aa791cd4-651a-406a-9dd0-a766c6470140]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2d:915d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 967922, 'tstamp': 967922}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220594, 'error': None, 'target': 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.043 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[f6fb3714-a90c-4293-b492-98d15d9355ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24d39604-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:91:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 967922, 'reachable_time': 18548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220595, 'error': None, 'target': 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.087 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[d9550ebd-dd99-4cdd-abb8-5efe58058693]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.163 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[87900b52-46b8-4880-ad15-247422c698c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.164 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24d39604-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.165 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.166 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24d39604-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.167 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:57 compute-1 NetworkManager[56069]: <info>  [1769553537.1684] manager: (tap24d39604-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 27 22:38:57 compute-1 kernel: tap24d39604-40: entered promiscuous mode
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.170 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.170 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24d39604-40, col_values=(('external_ids', {'iface-id': 'd5030a57-5c97-4207-a276-f5ed6938ad09'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.171 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:57 compute-1 ovn_controller[95969]: 2026-01-27T22:38:57Z|00116|binding|INFO|Releasing lport d5030a57-5c97-4207-a276-f5ed6938ad09 from this chassis (sb_readonly=0)
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.174 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec77f42-6abf-4388-a79f-dcaf93a60c66]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.174 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.175 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.175 105247 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 24d39604-44db-4002-b4d8-7ef0b15b5533 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.175 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.175 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7fae9a3f-b828-46eb-a886-65ab00335143]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.176 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.176 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[40026bae-cd10-41a9-b31e-e9ea3c98e578]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.177 105247 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: global
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     log         /dev/log local0 debug
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     log-tag     haproxy-metadata-proxy-24d39604-44db-4002-b4d8-7ef0b15b5533
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     user        root
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     group       root
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     maxconn     1024
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     pidfile     /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     daemon
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: defaults
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     log global
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     mode http
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     option httplog
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     option dontlognull
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     option http-server-close
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     option forwardfor
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     retries                 3
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     timeout http-request    30s
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     timeout connect         30s
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     timeout client          32s
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     timeout server          32s
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     timeout http-keep-alive 30s
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: listen listener
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     bind 169.254.169.254:80
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     server metadata /var/lib/neutron/metadata_proxy
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:     http-request add-header X-OVN-Network-ID 24d39604-44db-4002-b4d8-7ef0b15b5533
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 27 22:38:57 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:38:57.177 105247 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'env', 'PROCESS_TAG=haproxy-24d39604-44db-4002-b4d8-7ef0b15b5533', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24d39604-44db-4002-b4d8-7ef0b15b5533.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.186 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:57 compute-1 podman[220634]: 2026-01-27 22:38:57.612657141 +0000 UTC m=+0.070009311 container create 80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Jan 27 22:38:57 compute-1 systemd[1]: Started libpod-conmon-80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60.scope.
Jan 27 22:38:57 compute-1 podman[220634]: 2026-01-27 22:38:57.573388061 +0000 UTC m=+0.030740251 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 22:38:57 compute-1 systemd[1]: Started libcrun container.
Jan 27 22:38:57 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54641fa701e2592ef6787d677ef657f0dcb815bc2744dc561cfddc78f6f8c5a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 22:38:57 compute-1 podman[220634]: 2026-01-27 22:38:57.720544236 +0000 UTC m=+0.177896406 container init 80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Jan 27 22:38:57 compute-1 podman[220634]: 2026-01-27 22:38:57.725680332 +0000 UTC m=+0.183032472 container start 80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126)
Jan 27 22:38:57 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220649]: [NOTICE]   (220653) : New worker (220655) forked
Jan 27 22:38:57 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220649]: [NOTICE]   (220653) : Loading success.
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.901 183755 DEBUG nova.compute.manager [req-cecd5434-2ad7-47dc-8975-7b1ffa0d18d3 req-dcc060f7-6760-4e0a-90bb-b9a8751330b6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Received event network-vif-plugged-c5f96afb-6844-4f0e-8162-c226b35de01d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.902 183755 DEBUG oslo_concurrency.lockutils [req-cecd5434-2ad7-47dc-8975-7b1ffa0d18d3 req-dcc060f7-6760-4e0a-90bb-b9a8751330b6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.903 183755 DEBUG oslo_concurrency.lockutils [req-cecd5434-2ad7-47dc-8975-7b1ffa0d18d3 req-dcc060f7-6760-4e0a-90bb-b9a8751330b6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.904 183755 DEBUG oslo_concurrency.lockutils [req-cecd5434-2ad7-47dc-8975-7b1ffa0d18d3 req-dcc060f7-6760-4e0a-90bb-b9a8751330b6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.904 183755 DEBUG nova.compute.manager [req-cecd5434-2ad7-47dc-8975-7b1ffa0d18d3 req-dcc060f7-6760-4e0a-90bb-b9a8751330b6 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Processing event network-vif-plugged-c5f96afb-6844-4f0e-8162-c226b35de01d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.906 183755 DEBUG nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.913 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.918 183755 INFO nova.virt.libvirt.driver [-] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Instance spawned successfully.
Jan 27 22:38:57 compute-1 nova_compute[183751]: 2026-01-27 22:38:57.919 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 27 22:38:58 compute-1 nova_compute[183751]: 2026-01-27 22:38:58.121 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:58 compute-1 nova_compute[183751]: 2026-01-27 22:38:58.434 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:38:58 compute-1 nova_compute[183751]: 2026-01-27 22:38:58.436 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:38:58 compute-1 nova_compute[183751]: 2026-01-27 22:38:58.437 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:38:58 compute-1 nova_compute[183751]: 2026-01-27 22:38:58.438 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:38:58 compute-1 nova_compute[183751]: 2026-01-27 22:38:58.438 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:38:58 compute-1 nova_compute[183751]: 2026-01-27 22:38:58.439 183755 DEBUG nova.virt.libvirt.driver [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:38:58 compute-1 nova_compute[183751]: 2026-01-27 22:38:58.954 183755 INFO nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Took 9.15 seconds to spawn the instance on the hypervisor.
Jan 27 22:38:58 compute-1 nova_compute[183751]: 2026-01-27 22:38:58.955 183755 DEBUG nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 27 22:38:59 compute-1 nova_compute[183751]: 2026-01-27 22:38:59.213 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:38:59 compute-1 nova_compute[183751]: 2026-01-27 22:38:59.490 183755 INFO nova.compute.manager [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Took 14.40 seconds to build instance.
Jan 27 22:38:59 compute-1 podman[220664]: 2026-01-27 22:38:59.884297553 +0000 UTC m=+0.179805883 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 22:38:59 compute-1 nova_compute[183751]: 2026-01-27 22:38:59.978 183755 DEBUG nova.compute.manager [req-9f1d9b30-3e33-40fa-9011-d31b8eab1007 req-41d5d8ef-79cb-458d-9f93-f5c38f510de1 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Received event network-vif-plugged-c5f96afb-6844-4f0e-8162-c226b35de01d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:38:59 compute-1 nova_compute[183751]: 2026-01-27 22:38:59.978 183755 DEBUG oslo_concurrency.lockutils [req-9f1d9b30-3e33-40fa-9011-d31b8eab1007 req-41d5d8ef-79cb-458d-9f93-f5c38f510de1 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:38:59 compute-1 nova_compute[183751]: 2026-01-27 22:38:59.979 183755 DEBUG oslo_concurrency.lockutils [req-9f1d9b30-3e33-40fa-9011-d31b8eab1007 req-41d5d8ef-79cb-458d-9f93-f5c38f510de1 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:38:59 compute-1 nova_compute[183751]: 2026-01-27 22:38:59.979 183755 DEBUG oslo_concurrency.lockutils [req-9f1d9b30-3e33-40fa-9011-d31b8eab1007 req-41d5d8ef-79cb-458d-9f93-f5c38f510de1 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:38:59 compute-1 nova_compute[183751]: 2026-01-27 22:38:59.980 183755 DEBUG nova.compute.manager [req-9f1d9b30-3e33-40fa-9011-d31b8eab1007 req-41d5d8ef-79cb-458d-9f93-f5c38f510de1 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] No waiting events found dispatching network-vif-plugged-c5f96afb-6844-4f0e-8162-c226b35de01d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:38:59 compute-1 nova_compute[183751]: 2026-01-27 22:38:59.980 183755 WARNING nova.compute.manager [req-9f1d9b30-3e33-40fa-9011-d31b8eab1007 req-41d5d8ef-79cb-458d-9f93-f5c38f510de1 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Received unexpected event network-vif-plugged-c5f96afb-6844-4f0e-8162-c226b35de01d for instance with vm_state active and task_state None.
Jan 27 22:39:00 compute-1 nova_compute[183751]: 2026-01-27 22:39:00.002 183755 DEBUG oslo_concurrency.lockutils [None req-5b740e45-ce9d-429f-b349-11c8080a7f10 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.932s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:39:03 compute-1 nova_compute[183751]: 2026-01-27 22:39:03.123 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:03 compute-1 nova_compute[183751]: 2026-01-27 22:39:03.342 183755 DEBUG oslo_concurrency.lockutils [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "aa7a18b8-946b-4b47-abf0-c146699d2c11" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:39:03 compute-1 nova_compute[183751]: 2026-01-27 22:39:03.343 183755 DEBUG oslo_concurrency.lockutils [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:39:03 compute-1 nova_compute[183751]: 2026-01-27 22:39:03.344 183755 DEBUG oslo_concurrency.lockutils [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:39:03 compute-1 nova_compute[183751]: 2026-01-27 22:39:03.345 183755 DEBUG oslo_concurrency.lockutils [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:39:03 compute-1 nova_compute[183751]: 2026-01-27 22:39:03.346 183755 DEBUG oslo_concurrency.lockutils [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:39:03 compute-1 nova_compute[183751]: 2026-01-27 22:39:03.388 183755 INFO nova.compute.manager [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Terminating instance
Jan 27 22:39:03 compute-1 podman[220687]: 2026-01-27 22:39:03.761451561 +0000 UTC m=+0.074648105 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Jan 27 22:39:03 compute-1 podman[220688]: 2026-01-27 22:39:03.779662001 +0000 UTC m=+0.084850287 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 27 22:39:03 compute-1 nova_compute[183751]: 2026-01-27 22:39:03.918 183755 DEBUG nova.compute.manager [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 27 22:39:03 compute-1 kernel: tapc5f96afb-68 (unregistering): left promiscuous mode
Jan 27 22:39:03 compute-1 NetworkManager[56069]: <info>  [1769553543.9434] device (tapc5f96afb-68): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.007 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 ovn_controller[95969]: 2026-01-27T22:39:04Z|00117|binding|INFO|Releasing lport c5f96afb-6844-4f0e-8162-c226b35de01d from this chassis (sb_readonly=0)
Jan 27 22:39:04 compute-1 ovn_controller[95969]: 2026-01-27T22:39:04Z|00118|binding|INFO|Setting lport c5f96afb-6844-4f0e-8162-c226b35de01d down in Southbound
Jan 27 22:39:04 compute-1 ovn_controller[95969]: 2026-01-27T22:39:04Z|00119|binding|INFO|Removing iface tapc5f96afb-68 ovn-installed in OVS
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.011 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.016 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:43:42 10.100.0.13'], port_security=['fa:16:3e:f3:43:42 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'aa7a18b8-946b-4b47-abf0-c146699d2c11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f85db165fdfb4bf4a093051065554230', 'neutron:revision_number': '5', 'neutron:security_group_ids': '95cb9870-b574-46eb-94c8-ffb9aee48a06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f8e41a7-2994-4b63-aa2e-14041916e5f8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=c5f96afb-6844-4f0e-8162-c226b35de01d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.018 105247 INFO neutron.agent.ovn.metadata.agent [-] Port c5f96afb-6844-4f0e-8162-c226b35de01d in datapath 24d39604-44db-4002-b4d8-7ef0b15b5533 unbound from our chassis
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.019 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24d39604-44db-4002-b4d8-7ef0b15b5533, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.021 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.021 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[4db49c37-f8c6-46c9-bdc4-84c4ab2e6ad3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.021 105247 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533 namespace which is not needed anymore
Jan 27 22:39:04 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Jan 27 22:39:04 compute-1 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000001a.scope: Consumed 6.551s CPU time.
Jan 27 22:39:04 compute-1 systemd-machined[155034]: Machine qemu-9-instance-0000001a terminated.
Jan 27 22:39:04 compute-1 kernel: tapc5f96afb-68: entered promiscuous mode
Jan 27 22:39:04 compute-1 kernel: tapc5f96afb-68 (unregistering): left promiscuous mode
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.149 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 ovn_controller[95969]: 2026-01-27T22:39:04Z|00120|binding|INFO|Claiming lport c5f96afb-6844-4f0e-8162-c226b35de01d for this chassis.
Jan 27 22:39:04 compute-1 ovn_controller[95969]: 2026-01-27T22:39:04Z|00121|binding|INFO|c5f96afb-6844-4f0e-8162-c226b35de01d: Claiming fa:16:3e:f3:43:42 10.100.0.13
Jan 27 22:39:04 compute-1 ovn_controller[95969]: 2026-01-27T22:39:04Z|00122|if_status|INFO|Not setting lport c5f96afb-6844-4f0e-8162-c226b35de01d down as sb is readonly
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.162 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 ovn_controller[95969]: 2026-01-27T22:39:04Z|00123|binding|INFO|Releasing lport c5f96afb-6844-4f0e-8162-c226b35de01d from this chassis (sb_readonly=0)
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.172 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:43:42 10.100.0.13'], port_security=['fa:16:3e:f3:43:42 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'aa7a18b8-946b-4b47-abf0-c146699d2c11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f85db165fdfb4bf4a093051065554230', 'neutron:revision_number': '5', 'neutron:security_group_ids': '95cb9870-b574-46eb-94c8-ffb9aee48a06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f8e41a7-2994-4b63-aa2e-14041916e5f8, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=c5f96afb-6844-4f0e-8162-c226b35de01d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.176 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:43:42 10.100.0.13'], port_security=['fa:16:3e:f3:43:42 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'aa7a18b8-946b-4b47-abf0-c146699d2c11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24d39604-44db-4002-b4d8-7ef0b15b5533', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f85db165fdfb4bf4a093051065554230', 'neutron:revision_number': '5', 'neutron:security_group_ids': '95cb9870-b574-46eb-94c8-ffb9aee48a06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f8e41a7-2994-4b63-aa2e-14041916e5f8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=c5f96afb-6844-4f0e-8162-c226b35de01d) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:39:04 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220649]: [NOTICE]   (220653) : haproxy version is 3.0.5-8e879a5
Jan 27 22:39:04 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220649]: [NOTICE]   (220653) : path to executable is /usr/sbin/haproxy
Jan 27 22:39:04 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220649]: [WARNING]  (220653) : Exiting Master process...
Jan 27 22:39:04 compute-1 podman[220750]: 2026-01-27 22:39:04.177819494 +0000 UTC m=+0.043392022 container kill 80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 27 22:39:04 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220649]: [ALERT]    (220653) : Current worker (220655) exited with code 143 (Terminated)
Jan 27 22:39:04 compute-1 neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533[220649]: [WARNING]  (220653) : All workers exited. Exiting... (0)
Jan 27 22:39:04 compute-1 systemd[1]: libpod-80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60.scope: Deactivated successfully.
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.198 183755 INFO nova.virt.libvirt.driver [-] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Instance destroyed successfully.
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.198 183755 DEBUG nova.objects.instance [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lazy-loading 'resources' on Instance uuid aa7a18b8-946b-4b47-abf0-c146699d2c11 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.215 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 podman[220773]: 2026-01-27 22:39:04.235996451 +0000 UTC m=+0.029763496 container died 80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20260126)
Jan 27 22:39:04 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60-userdata-shm.mount: Deactivated successfully.
Jan 27 22:39:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-54641fa701e2592ef6787d677ef657f0dcb815bc2744dc561cfddc78f6f8c5a6-merged.mount: Deactivated successfully.
Jan 27 22:39:04 compute-1 podman[220773]: 2026-01-27 22:39:04.274203575 +0000 UTC m=+0.067970600 container cleanup 80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 27 22:39:04 compute-1 systemd[1]: libpod-conmon-80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60.scope: Deactivated successfully.
Jan 27 22:39:04 compute-1 podman[220775]: 2026-01-27 22:39:04.295580613 +0000 UTC m=+0.080568121 container remove 80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.302 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb2c62b-d220-4728-b28a-0ebaf2d1e4c7]: (4, ("Tue Jan 27 10:39:04 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533 (80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60)\n80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60\nTue Jan 27 10:39:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533 (80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60)\n80444624bdf8e9cc3bedc29a856ed72079f0fb06af56b82f416ceb852f7aca60\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.304 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[eece75dc-ab4f-4d43-91e0-06c1581ed6ac]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.304 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24d39604-44db-4002-b4d8-7ef0b15b5533.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.305 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[83afd511-5e1f-4271-9ffa-9b185b025803]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.306 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24d39604-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.308 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 kernel: tap24d39604-40: left promiscuous mode
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.318 183755 DEBUG nova.compute.manager [req-25b999c5-8adc-4c7f-b181-51862ab9c87a req-0c9ba99d-4c79-49b6-8d1d-a406e16e3173 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Received event network-vif-unplugged-c5f96afb-6844-4f0e-8162-c226b35de01d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.319 183755 DEBUG oslo_concurrency.lockutils [req-25b999c5-8adc-4c7f-b181-51862ab9c87a req-0c9ba99d-4c79-49b6-8d1d-a406e16e3173 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.319 183755 DEBUG oslo_concurrency.lockutils [req-25b999c5-8adc-4c7f-b181-51862ab9c87a req-0c9ba99d-4c79-49b6-8d1d-a406e16e3173 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.319 183755 DEBUG oslo_concurrency.lockutils [req-25b999c5-8adc-4c7f-b181-51862ab9c87a req-0c9ba99d-4c79-49b6-8d1d-a406e16e3173 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.320 183755 DEBUG nova.compute.manager [req-25b999c5-8adc-4c7f-b181-51862ab9c87a req-0c9ba99d-4c79-49b6-8d1d-a406e16e3173 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] No waiting events found dispatching network-vif-unplugged-c5f96afb-6844-4f0e-8162-c226b35de01d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.320 183755 DEBUG nova.compute.manager [req-25b999c5-8adc-4c7f-b181-51862ab9c87a req-0c9ba99d-4c79-49b6-8d1d-a406e16e3173 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Received event network-vif-unplugged-c5f96afb-6844-4f0e-8162-c226b35de01d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.337 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.340 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7aff39c2-d3b4-420c-894a-2198931965fe]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.354 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2c63eb-8178-4475-98d1-80e7895b0c49]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.354 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[e93a6601-b7a0-4fd0-b6a7-d6be6b98de6b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.375 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[13199124-ec35-49a5-a16a-6060ba54db85]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 967911, 'reachable_time': 37636, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220809, 'error': None, 'target': 'ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.377 105687 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24d39604-44db-4002-b4d8-7ef0b15b5533 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.377 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[0147f389-9b70-4ca8-97df-9094b2ebe5fb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.378 105247 INFO neutron.agent.ovn.metadata.agent [-] Port c5f96afb-6844-4f0e-8162-c226b35de01d in datapath 24d39604-44db-4002-b4d8-7ef0b15b5533 unbound from our chassis
Jan 27 22:39:04 compute-1 systemd[1]: run-netns-ovnmeta\x2d24d39604\x2d44db\x2d4002\x2db4d8\x2d7ef0b15b5533.mount: Deactivated successfully.
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.379 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24d39604-44db-4002-b4d8-7ef0b15b5533, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.380 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[7c941388-c6e9-40c2-8150-c02715c398a7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.380 105247 INFO neutron.agent.ovn.metadata.agent [-] Port c5f96afb-6844-4f0e-8162-c226b35de01d in datapath 24d39604-44db-4002-b4d8-7ef0b15b5533 unbound from our chassis
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.381 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24d39604-44db-4002-b4d8-7ef0b15b5533, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:39:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:04.381 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a6d35a-e8fc-48dc-bce6-f3daba0631e6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.706 183755 DEBUG nova.virt.libvirt.vif [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-27T22:38:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1191134516',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1191134516',id=26,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-27T22:38:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f85db165fdfb4bf4a093051065554230',ramdisk_id='',reservation_id='r-qtc2emtx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member,manager',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1358952714',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1358952714-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-27T22:38:58Z,user_data=None,user_id='84404785aedd471590f8ac69cbbb69db',uuid=aa7a18b8-946b-4b47-abf0-c146699d2c11,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5f96afb-6844-4f0e-8162-c226b35de01d", "address": "fa:16:3e:f3:43:42", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f96afb-68", "ovs_interfaceid": "c5f96afb-6844-4f0e-8162-c226b35de01d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.707 183755 DEBUG nova.network.os_vif_util [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converting VIF {"id": "c5f96afb-6844-4f0e-8162-c226b35de01d", "address": "fa:16:3e:f3:43:42", "network": {"id": "24d39604-44db-4002-b4d8-7ef0b15b5533", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1143538185-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a0500b54ccc240539a97125971e92178", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f96afb-68", "ovs_interfaceid": "c5f96afb-6844-4f0e-8162-c226b35de01d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.708 183755 DEBUG nova.network.os_vif_util [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:43:42,bridge_name='br-int',has_traffic_filtering=True,id=c5f96afb-6844-4f0e-8162-c226b35de01d,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f96afb-68') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.708 183755 DEBUG os_vif [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:43:42,bridge_name='br-int',has_traffic_filtering=True,id=c5f96afb-6844-4f0e-8162-c226b35de01d,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f96afb-68') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.710 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.710 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5f96afb-68, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.712 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.714 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.715 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.715 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=dbfe6175-1a55-46a5-97ca-33a59f05e75e) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.716 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.717 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.721 183755 INFO os_vif [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:43:42,bridge_name='br-int',has_traffic_filtering=True,id=c5f96afb-6844-4f0e-8162-c226b35de01d,network=Network(24d39604-44db-4002-b4d8-7ef0b15b5533),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f96afb-68')
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.721 183755 INFO nova.virt.libvirt.driver [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Deleting instance files /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11_del
Jan 27 22:39:04 compute-1 nova_compute[183751]: 2026-01-27 22:39:04.722 183755 INFO nova.virt.libvirt.driver [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Deletion of /var/lib/nova/instances/aa7a18b8-946b-4b47-abf0-c146699d2c11_del complete
Jan 27 22:39:05 compute-1 nova_compute[183751]: 2026-01-27 22:39:05.235 183755 INFO nova.compute.manager [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Took 1.32 seconds to destroy the instance on the hypervisor.
Jan 27 22:39:05 compute-1 nova_compute[183751]: 2026-01-27 22:39:05.236 183755 DEBUG oslo.service.backend._eventlet.loopingcall [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 27 22:39:05 compute-1 nova_compute[183751]: 2026-01-27 22:39:05.236 183755 DEBUG nova.compute.manager [-] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 27 22:39:05 compute-1 nova_compute[183751]: 2026-01-27 22:39:05.237 183755 DEBUG nova.network.neutron [-] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 27 22:39:05 compute-1 nova_compute[183751]: 2026-01-27 22:39:05.237 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:39:05 compute-1 podman[193064]: time="2026-01-27T22:39:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:39:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:39:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:39:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:39:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 27 22:39:05 compute-1 nova_compute[183751]: 2026-01-27 22:39:05.711 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.113 183755 DEBUG nova.compute.manager [req-99639887-2449-4e14-814d-cad4cc1ddf90 req-213a947e-4472-4889-ba07-434686585ede 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Received event network-vif-deleted-c5f96afb-6844-4f0e-8162-c226b35de01d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.114 183755 INFO nova.compute.manager [req-99639887-2449-4e14-814d-cad4cc1ddf90 req-213a947e-4472-4889-ba07-434686585ede 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Neutron deleted interface c5f96afb-6844-4f0e-8162-c226b35de01d; detaching it from the instance and deleting it from the info cache
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.114 183755 DEBUG nova.network.neutron [req-99639887-2449-4e14-814d-cad4cc1ddf90 req-213a947e-4472-4889-ba07-434686585ede 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.418 183755 DEBUG nova.compute.manager [req-3543aa12-d389-459b-99fa-5b92e6c77bdb req-2e46645b-b980-4310-87a4-681c5a29868d 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Received event network-vif-unplugged-c5f96afb-6844-4f0e-8162-c226b35de01d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.419 183755 DEBUG oslo_concurrency.lockutils [req-3543aa12-d389-459b-99fa-5b92e6c77bdb req-2e46645b-b980-4310-87a4-681c5a29868d 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.420 183755 DEBUG oslo_concurrency.lockutils [req-3543aa12-d389-459b-99fa-5b92e6c77bdb req-2e46645b-b980-4310-87a4-681c5a29868d 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.420 183755 DEBUG oslo_concurrency.lockutils [req-3543aa12-d389-459b-99fa-5b92e6c77bdb req-2e46645b-b980-4310-87a4-681c5a29868d 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.421 183755 DEBUG nova.compute.manager [req-3543aa12-d389-459b-99fa-5b92e6c77bdb req-2e46645b-b980-4310-87a4-681c5a29868d 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] No waiting events found dispatching network-vif-unplugged-c5f96afb-6844-4f0e-8162-c226b35de01d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.421 183755 DEBUG nova.compute.manager [req-3543aa12-d389-459b-99fa-5b92e6c77bdb req-2e46645b-b980-4310-87a4-681c5a29868d 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Received event network-vif-unplugged-c5f96afb-6844-4f0e-8162-c226b35de01d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.556 183755 DEBUG nova.network.neutron [-] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:39:06 compute-1 nova_compute[183751]: 2026-01-27 22:39:06.623 183755 DEBUG nova.compute.manager [req-99639887-2449-4e14-814d-cad4cc1ddf90 req-213a947e-4472-4889-ba07-434686585ede 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Detach interface failed, port_id=c5f96afb-6844-4f0e-8162-c226b35de01d, reason: Instance aa7a18b8-946b-4b47-abf0-c146699d2c11 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Jan 27 22:39:07 compute-1 nova_compute[183751]: 2026-01-27 22:39:07.064 183755 INFO nova.compute.manager [-] [instance: aa7a18b8-946b-4b47-abf0-c146699d2c11] Took 1.83 seconds to deallocate network for instance.
Jan 27 22:39:07 compute-1 nova_compute[183751]: 2026-01-27 22:39:07.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:39:07 compute-1 nova_compute[183751]: 2026-01-27 22:39:07.590 183755 DEBUG oslo_concurrency.lockutils [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:39:07 compute-1 nova_compute[183751]: 2026-01-27 22:39:07.591 183755 DEBUG oslo_concurrency.lockutils [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:39:07 compute-1 nova_compute[183751]: 2026-01-27 22:39:07.706 183755 DEBUG nova.compute.provider_tree [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:39:08 compute-1 nova_compute[183751]: 2026-01-27 22:39:08.126 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:08 compute-1 nova_compute[183751]: 2026-01-27 22:39:08.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:39:08 compute-1 nova_compute[183751]: 2026-01-27 22:39:08.216 183755 DEBUG nova.scheduler.client.report [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:39:08 compute-1 nova_compute[183751]: 2026-01-27 22:39:08.730 183755 DEBUG oslo_concurrency.lockutils [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.139s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:39:08 compute-1 nova_compute[183751]: 2026-01-27 22:39:08.767 183755 INFO nova.scheduler.client.report [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Deleted allocations for instance aa7a18b8-946b-4b47-abf0-c146699d2c11
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.717 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.804 183755 DEBUG oslo_concurrency.lockutils [None req-48d77dba-e9cd-4083-960b-f613ed88d07b 84404785aedd471590f8ac69cbbb69db f85db165fdfb4bf4a093051065554230 - - default default] Lock "aa7a18b8-946b-4b47-abf0-c146699d2c11" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.460s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.915 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.917 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.950 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.951 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5815MB free_disk=73.136962890625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.952 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:39:09 compute-1 nova_compute[183751]: 2026-01-27 22:39:09.952 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:39:10 compute-1 nova_compute[183751]: 2026-01-27 22:39:10.997 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:39:10 compute-1 nova_compute[183751]: 2026-01-27 22:39:10.998 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:39:09 up  2:41,  0 user,  load average: 0.45, 0.20, 0.15\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:39:11 compute-1 nova_compute[183751]: 2026-01-27 22:39:11.029 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:39:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:11.299 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:39:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:11.299 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:39:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:11.299 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:39:11 compute-1 nova_compute[183751]: 2026-01-27 22:39:11.538 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:39:12 compute-1 nova_compute[183751]: 2026-01-27 22:39:12.052 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:39:12 compute-1 nova_compute[183751]: 2026-01-27 22:39:12.053 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:39:13 compute-1 nova_compute[183751]: 2026-01-27 22:39:13.050 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:39:13 compute-1 nova_compute[183751]: 2026-01-27 22:39:13.051 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:39:13 compute-1 nova_compute[183751]: 2026-01-27 22:39:13.051 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:39:13 compute-1 nova_compute[183751]: 2026-01-27 22:39:13.131 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:13 compute-1 nova_compute[183751]: 2026-01-27 22:39:13.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:39:13 compute-1 podman[220813]: 2026-01-27 22:39:13.747506192 +0000 UTC m=+0.051811451 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 27 22:39:14 compute-1 nova_compute[183751]: 2026-01-27 22:39:14.720 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:15 compute-1 nova_compute[183751]: 2026-01-27 22:39:15.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:39:18 compute-1 nova_compute[183751]: 2026-01-27 22:39:18.133 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:18 compute-1 nova_compute[183751]: 2026-01-27 22:39:18.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:39:19 compute-1 openstack_network_exporter[195945]: ERROR   22:39:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:39:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:39:19 compute-1 openstack_network_exporter[195945]: ERROR   22:39:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:39:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:39:19 compute-1 nova_compute[183751]: 2026-01-27 22:39:19.722 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:21 compute-1 nova_compute[183751]: 2026-01-27 22:39:21.473 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:23 compute-1 nova_compute[183751]: 2026-01-27 22:39:23.134 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:24 compute-1 nova_compute[183751]: 2026-01-27 22:39:24.724 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:28 compute-1 nova_compute[183751]: 2026-01-27 22:39:28.138 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:29 compute-1 nova_compute[183751]: 2026-01-27 22:39:29.726 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:30 compute-1 podman[220837]: 2026-01-27 22:39:30.827759493 +0000 UTC m=+0.135978459 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 27 22:39:33 compute-1 nova_compute[183751]: 2026-01-27 22:39:33.178 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:34 compute-1 nova_compute[183751]: 2026-01-27 22:39:34.728 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:34 compute-1 podman[220863]: 2026-01-27 22:39:34.801716062 +0000 UTC m=+0.101906358 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Jan 27 22:39:34 compute-1 podman[220864]: 2026-01-27 22:39:34.810984181 +0000 UTC m=+0.092797913 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260126)
Jan 27 22:39:34 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:34.827 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:39:34 compute-1 nova_compute[183751]: 2026-01-27 22:39:34.828 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:34 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:34.829 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:39:35 compute-1 podman[193064]: time="2026-01-27T22:39:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:39:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:39:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:39:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:39:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Jan 27 22:39:35 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:35.797 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:ce:ec 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3bb8ae35-374a-402a-86a2-14918e05f958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef8b4fcea1b9482fbcc882f3383af9f4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ba2e925d-a604-49fc-8d28-9e325ef478cb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b1c3a04c-9b21-40e0-b1e1-f0bdc41a2ba5) old=Port_Binding(mac=['fa:16:3e:e5:ce:ec'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3bb8ae35-374a-402a-86a2-14918e05f958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef8b4fcea1b9482fbcc882f3383af9f4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:39:35 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:35.798 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b1c3a04c-9b21-40e0-b1e1-f0bdc41a2ba5 in datapath 3bb8ae35-374a-402a-86a2-14918e05f958 updated
Jan 27 22:39:35 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:35.799 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3bb8ae35-374a-402a-86a2-14918e05f958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:39:35 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:35.800 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[e50a23db-d63c-4077-8c2e-c2a260be431b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:38 compute-1 nova_compute[183751]: 2026-01-27 22:39:38.180 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:39 compute-1 nova_compute[183751]: 2026-01-27 22:39:39.730 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:43 compute-1 nova_compute[183751]: 2026-01-27 22:39:43.183 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:44 compute-1 nova_compute[183751]: 2026-01-27 22:39:44.731 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:44 compute-1 podman[220901]: 2026-01-27 22:39:44.78851733 +0000 UTC m=+0.093316536 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:39:44 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:44.830 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:39:46 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:46.098 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:29:7a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e5fd3a21-193e-4268-b513-8f30d8b4b443', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5fd3a21-193e-4268-b513-8f30d8b4b443', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6857f0c4294a43aab72cea9f0842f4c8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e9f85da-7b07-47d1-8617-60ab9cd5664c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d968bcdc-34f5-438e-8786-73aeb12de17f) old=Port_Binding(mac=['fa:16:3e:00:29:7a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e5fd3a21-193e-4268-b513-8f30d8b4b443', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5fd3a21-193e-4268-b513-8f30d8b4b443', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6857f0c4294a43aab72cea9f0842f4c8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:39:46 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:46.099 105247 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d968bcdc-34f5-438e-8786-73aeb12de17f in datapath e5fd3a21-193e-4268-b513-8f30d8b4b443 updated
Jan 27 22:39:46 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:46.100 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e5fd3a21-193e-4268-b513-8f30d8b4b443, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:39:46 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:39:46.100 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[6d85fcbd-d565-47b4-82d2-2585b05695c5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:39:48 compute-1 nova_compute[183751]: 2026-01-27 22:39:48.186 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:49 compute-1 openstack_network_exporter[195945]: ERROR   22:39:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:39:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:39:49 compute-1 openstack_network_exporter[195945]: ERROR   22:39:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:39:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:39:49 compute-1 nova_compute[183751]: 2026-01-27 22:39:49.731 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:53 compute-1 nova_compute[183751]: 2026-01-27 22:39:53.204 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:54 compute-1 nova_compute[183751]: 2026-01-27 22:39:54.733 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:56 compute-1 ovn_controller[95969]: 2026-01-27T22:39:56Z|00124|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 27 22:39:58 compute-1 nova_compute[183751]: 2026-01-27 22:39:58.205 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:39:59 compute-1 nova_compute[183751]: 2026-01-27 22:39:59.736 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:00 compute-1 nova_compute[183751]: 2026-01-27 22:40:00.543 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:00 compute-1 nova_compute[183751]: 2026-01-27 22:40:00.544 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:01 compute-1 nova_compute[183751]: 2026-01-27 22:40:01.050 183755 DEBUG nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Jan 27 22:40:01 compute-1 nova_compute[183751]: 2026-01-27 22:40:01.622 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:01 compute-1 nova_compute[183751]: 2026-01-27 22:40:01.623 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:01 compute-1 nova_compute[183751]: 2026-01-27 22:40:01.634 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Jan 27 22:40:01 compute-1 nova_compute[183751]: 2026-01-27 22:40:01.635 183755 INFO nova.compute.claims [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Claim successful on node compute-1.ctlplane.example.com
Jan 27 22:40:01 compute-1 podman[220927]: 2026-01-27 22:40:01.911331223 +0000 UTC m=+0.207877575 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 27 22:40:02 compute-1 nova_compute[183751]: 2026-01-27 22:40:02.717 183755 DEBUG nova.compute.provider_tree [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:40:03 compute-1 nova_compute[183751]: 2026-01-27 22:40:03.207 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:03 compute-1 nova_compute[183751]: 2026-01-27 22:40:03.234 183755 DEBUG nova.scheduler.client.report [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:40:03 compute-1 nova_compute[183751]: 2026-01-27 22:40:03.748 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.125s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:03 compute-1 nova_compute[183751]: 2026-01-27 22:40:03.749 183755 DEBUG nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Jan 27 22:40:04 compute-1 nova_compute[183751]: 2026-01-27 22:40:04.262 183755 DEBUG nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Jan 27 22:40:04 compute-1 nova_compute[183751]: 2026-01-27 22:40:04.262 183755 DEBUG nova.network.neutron [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Jan 27 22:40:04 compute-1 nova_compute[183751]: 2026-01-27 22:40:04.263 183755 WARNING neutronclient.v2_0.client [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:40:04 compute-1 nova_compute[183751]: 2026-01-27 22:40:04.264 183755 WARNING neutronclient.v2_0.client [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:40:04 compute-1 nova_compute[183751]: 2026-01-27 22:40:04.739 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:04 compute-1 nova_compute[183751]: 2026-01-27 22:40:04.772 183755 INFO nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 27 22:40:04 compute-1 nova_compute[183751]: 2026-01-27 22:40:04.848 183755 DEBUG nova.network.neutron [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Successfully created port: 429754cf-05d3-4257-8e96-2548d7d594e7 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Jan 27 22:40:05 compute-1 nova_compute[183751]: 2026-01-27 22:40:05.279 183755 DEBUG nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Jan 27 22:40:05 compute-1 podman[193064]: time="2026-01-27T22:40:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:40:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:40:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:40:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:40:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2183 "" "Go-http-client/1.1"
Jan 27 22:40:05 compute-1 podman[220954]: 2026-01-27 22:40:05.752340518 +0000 UTC m=+0.057440780 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260126, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 27 22:40:05 compute-1 podman[220953]: 2026-01-27 22:40:05.78074659 +0000 UTC m=+0.082919719 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.302 183755 DEBUG nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.303 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.304 183755 INFO nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Creating image(s)
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.304 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "/var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.305 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "/var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.305 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "/var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.306 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.309 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.312 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.397 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.399 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.399 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.400 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.404 183755 DEBUG oslo_utils.imageutils.format_inspector [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.404 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.472 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.474 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.514 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd,backing_fmt=raw /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.515 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "9cf64b3615b46c7f4af3198185a658f62d1b2bcd" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.516 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.538 183755 DEBUG nova.network.neutron [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Successfully updated port: 429754cf-05d3-4257-8e96-2548d7d594e7 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.593 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9cf64b3615b46c7f4af3198185a658f62d1b2bcd --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.594 183755 DEBUG nova.virt.disk.api [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Checking if we can resize image /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.594 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.609 183755 DEBUG nova.compute.manager [req-c205a882-b604-4077-8613-fffe8e2f1590 req-efa421c5-e2df-4fb2-9b20-b19b29f7b004 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Received event network-changed-429754cf-05d3-4257-8e96-2548d7d594e7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.610 183755 DEBUG nova.compute.manager [req-c205a882-b604-4077-8613-fffe8e2f1590 req-efa421c5-e2df-4fb2-9b20-b19b29f7b004 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Refreshing instance network info cache due to event network-changed-429754cf-05d3-4257-8e96-2548d7d594e7. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.610 183755 DEBUG oslo_concurrency.lockutils [req-c205a882-b604-4077-8613-fffe8e2f1590 req-efa421c5-e2df-4fb2-9b20-b19b29f7b004 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "refresh_cache-199b6173-7807-45a0-9d03-8d9e1945a3ca" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.610 183755 DEBUG oslo_concurrency.lockutils [req-c205a882-b604-4077-8613-fffe8e2f1590 req-efa421c5-e2df-4fb2-9b20-b19b29f7b004 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquired lock "refresh_cache-199b6173-7807-45a0-9d03-8d9e1945a3ca" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.610 183755 DEBUG nova.network.neutron [req-c205a882-b604-4077-8613-fffe8e2f1590 req-efa421c5-e2df-4fb2-9b20-b19b29f7b004 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Refreshing network info cache for port 429754cf-05d3-4257-8e96-2548d7d594e7 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.650 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.651 183755 DEBUG nova.virt.disk.api [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Cannot resize image /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.651 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.652 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Ensure instance console log exists: /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.653 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.653 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:06 compute-1 nova_compute[183751]: 2026-01-27 22:40:06.653 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:07 compute-1 nova_compute[183751]: 2026-01-27 22:40:07.046 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "refresh_cache-199b6173-7807-45a0-9d03-8d9e1945a3ca" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Jan 27 22:40:07 compute-1 nova_compute[183751]: 2026-01-27 22:40:07.117 183755 WARNING neutronclient.v2_0.client [req-c205a882-b604-4077-8613-fffe8e2f1590 req-efa421c5-e2df-4fb2-9b20-b19b29f7b004 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:40:08 compute-1 nova_compute[183751]: 2026-01-27 22:40:08.013 183755 DEBUG nova.network.neutron [req-c205a882-b604-4077-8613-fffe8e2f1590 req-efa421c5-e2df-4fb2-9b20-b19b29f7b004 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:40:08 compute-1 nova_compute[183751]: 2026-01-27 22:40:08.210 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:08 compute-1 nova_compute[183751]: 2026-01-27 22:40:08.245 183755 DEBUG nova.network.neutron [req-c205a882-b604-4077-8613-fffe8e2f1590 req-efa421c5-e2df-4fb2-9b20-b19b29f7b004 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:40:08 compute-1 nova_compute[183751]: 2026-01-27 22:40:08.753 183755 DEBUG oslo_concurrency.lockutils [req-c205a882-b604-4077-8613-fffe8e2f1590 req-efa421c5-e2df-4fb2-9b20-b19b29f7b004 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Releasing lock "refresh_cache-199b6173-7807-45a0-9d03-8d9e1945a3ca" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:40:08 compute-1 nova_compute[183751]: 2026-01-27 22:40:08.755 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquired lock "refresh_cache-199b6173-7807-45a0-9d03-8d9e1945a3ca" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Jan 27 22:40:08 compute-1 nova_compute[183751]: 2026-01-27 22:40:08.756 183755 DEBUG nova.network.neutron [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Jan 27 22:40:09 compute-1 nova_compute[183751]: 2026-01-27 22:40:09.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:40:09 compute-1 nova_compute[183751]: 2026-01-27 22:40:09.742 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:10 compute-1 nova_compute[183751]: 2026-01-27 22:40:10.039 183755 DEBUG nova.network.neutron [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Jan 27 22:40:10 compute-1 nova_compute[183751]: 2026-01-27 22:40:10.143 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:40:10 compute-1 nova_compute[183751]: 2026-01-27 22:40:10.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:40:10 compute-1 nova_compute[183751]: 2026-01-27 22:40:10.241 183755 WARNING neutronclient.v2_0.client [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:40:10 compute-1 nova_compute[183751]: 2026-01-27 22:40:10.736 183755 DEBUG nova.network.neutron [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Updating instance_info_cache with network_info: [{"id": "429754cf-05d3-4257-8e96-2548d7d594e7", "address": "fa:16:3e:04:e4:55", "network": {"id": "3bb8ae35-374a-402a-86a2-14918e05f958", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-438128632-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef8b4fcea1b9482fbcc882f3383af9f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap429754cf-05", "ovs_interfaceid": "429754cf-05d3-4257-8e96-2548d7d594e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.244 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Releasing lock "refresh_cache-199b6173-7807-45a0-9d03-8d9e1945a3ca" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.245 183755 DEBUG nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Instance network_info: |[{"id": "429754cf-05d3-4257-8e96-2548d7d594e7", "address": "fa:16:3e:04:e4:55", "network": {"id": "3bb8ae35-374a-402a-86a2-14918e05f958", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-438128632-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef8b4fcea1b9482fbcc882f3383af9f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap429754cf-05", "ovs_interfaceid": "429754cf-05d3-4257-8e96-2548d7d594e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.249 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Start _get_guest_xml network_info=[{"id": "429754cf-05d3-4257-8e96-2548d7d594e7", "address": "fa:16:3e:04:e4:55", "network": {"id": "3bb8ae35-374a-402a-86a2-14918e05f958", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-438128632-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef8b4fcea1b9482fbcc882f3383af9f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap429754cf-05", "ovs_interfaceid": "429754cf-05d3-4257-8e96-2548d7d594e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_options': None, 'encrypted': False, 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'device_type': 'disk', 'image_id': '46eb297a-0b7d-41f9-8336-a7ae35b5797e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.254 183755 WARNING nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.257 183755 DEBUG nova.virt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1423487030', uuid='199b6173-7807-45a0-9d03-8d9e1945a3ca'), owner=OwnerMeta(userid='b54ab6cd3d74475e9b38cfa8f4f224bf', username='tempest-TestExecuteZoneMigrationStrategyVolume-756468494-project-admin', projectid='6857f0c4294a43aab72cea9f0842f4c8', projectname='tempest-TestExecuteZoneMigrationStrategyVolume-756468494'), image=ImageMeta(id='46eb297a-0b7d-41f9-8336-a7ae35b5797e', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "429754cf-05d3-4257-8e96-2548d7d594e7", "address": "fa:16:3e:04:e4:55", "network": {"id": "3bb8ae35-374a-402a-86a2-14918e05f958", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-438128632-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef8b4fcea1b9482fbcc882f3383af9f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap429754cf-05", 
"ovs_interfaceid": "429754cf-05d3-4257-8e96-2548d7d594e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1769553611.2570105) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.264 183755 DEBUG nova.virt.libvirt.host [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.265 183755 DEBUG nova.virt.libvirt.host [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.269 183755 DEBUG nova.virt.libvirt.host [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.269 183755 DEBUG nova.virt.libvirt.host [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.272 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.272 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-27T22:07:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='4ed8ebce-d846-4d13-950a-4fb8c3a02942',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-27T22:07:14Z,direct_url=<?>,disk_format='qcow2',id=46eb297a-0b7d-41f9-8336-a7ae35b5797e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='f70ec523177247bdb6ca1b7e476d21b7',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-27T22:07:16Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.273 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.273 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.274 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.274 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.275 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.275 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.275 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.276 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.276 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.277 183755 DEBUG nova.virt.hardware [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.284 183755 DEBUG nova.virt.libvirt.vif [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:39:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1423487030',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-142348703',id=27,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6857f0c4294a43aab72cea9f0842f4c8',ramdisk_id='',reservation_id='r-te8pm4wy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-756468494',owner_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-756468494-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:40:05Z,user_data=None,user_id='b54ab6cd3d74475e9b38cfa8f4f224bf',uuid=199b6173-7807-45a0-9d03-8d9e1945a3ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "429754cf-05d3-4257-8e96-2548d7d594e7", "address": "fa:16:3e:04:e4:55", "network": {"id": "3bb8ae35-374a-402a-86a2-14918e05f958", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-438128632-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef8b4fcea1b9482fbcc882f3383af9f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap429754cf-05", "ovs_interfaceid": "429754cf-05d3-4257-8e96-2548d7d594e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.284 183755 DEBUG nova.network.os_vif_util [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Converting VIF {"id": "429754cf-05d3-4257-8e96-2548d7d594e7", "address": "fa:16:3e:04:e4:55", "network": {"id": "3bb8ae35-374a-402a-86a2-14918e05f958", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-438128632-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef8b4fcea1b9482fbcc882f3383af9f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap429754cf-05", "ovs_interfaceid": "429754cf-05d3-4257-8e96-2548d7d594e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.285 183755 DEBUG nova.network.os_vif_util [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:e4:55,bridge_name='br-int',has_traffic_filtering=True,id=429754cf-05d3-4257-8e96-2548d7d594e7,network=Network(3bb8ae35-374a-402a-86a2-14918e05f958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap429754cf-05') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.287 183755 DEBUG nova.objects.instance [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 199b6173-7807-45a0-9d03-8d9e1945a3ca obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:40:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:11.301 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:11.301 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:11.302 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.663 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.664 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.664 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.797 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] End _get_guest_xml xml=<domain type="kvm">
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <uuid>199b6173-7807-45a0-9d03-8d9e1945a3ca</uuid>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <name>instance-0000001b</name>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <memory>131072</memory>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <vcpu>1</vcpu>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <metadata>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <nova:name>tempest-TestExecuteZoneMigrationStrategyVolume-server-1423487030</nova:name>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <nova:creationTime>2026-01-27 22:40:11</nova:creationTime>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <nova:flavor name="m1.nano" id="4ed8ebce-d846-4d13-950a-4fb8c3a02942">
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:memory>128</nova:memory>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:disk>1</nova:disk>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:swap>0</nova:swap>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:ephemeral>0</nova:ephemeral>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:vcpus>1</nova:vcpus>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:extraSpecs>
Jan 27 22:40:11 compute-1 nova_compute[183751]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         </nova:extraSpecs>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       </nova:flavor>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <nova:image uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e">
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:containerFormat>bare</nova:containerFormat>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:diskFormat>qcow2</nova:diskFormat>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:minDisk>1</nova:minDisk>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:minRam>0</nova:minRam>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:properties>
Jan 27 22:40:11 compute-1 nova_compute[183751]:           <nova:property name="hw_rng_model">virtio</nova:property>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         </nova:properties>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       </nova:image>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <nova:owner>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:user uuid="b54ab6cd3d74475e9b38cfa8f4f224bf">tempest-TestExecuteZoneMigrationStrategyVolume-756468494-project-admin</nova:user>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:project uuid="6857f0c4294a43aab72cea9f0842f4c8">tempest-TestExecuteZoneMigrationStrategyVolume-756468494</nova:project>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       </nova:owner>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <nova:root type="image" uuid="46eb297a-0b7d-41f9-8336-a7ae35b5797e"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <nova:ports>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         <nova:port uuid="429754cf-05d3-4257-8e96-2548d7d594e7">
Jan 27 22:40:11 compute-1 nova_compute[183751]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:         </nova:port>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       </nova:ports>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     </nova:instance>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   </metadata>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <sysinfo type="smbios">
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <system>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <entry name="manufacturer">RDO</entry>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <entry name="product">OpenStack Compute</entry>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <entry name="serial">199b6173-7807-45a0-9d03-8d9e1945a3ca</entry>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <entry name="uuid">199b6173-7807-45a0-9d03-8d9e1945a3ca</entry>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <entry name="family">Virtual Machine</entry>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     </system>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   </sysinfo>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <os>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <boot dev="hd"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <smbios mode="sysinfo"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   </os>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <features>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <acpi/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <apic/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <vmcoreinfo/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   </features>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <clock offset="utc">
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <timer name="pit" tickpolicy="delay"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <timer name="hpet" present="no"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   </clock>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <cpu mode="custom" match="exact">
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <model>Nehalem</model>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <topology sockets="1" cores="1" threads="1"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   </cpu>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   <devices>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <disk type="file" device="disk">
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <target dev="vda" bus="virtio"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <disk type="file" device="cdrom">
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <driver name="qemu" type="raw" cache="none"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <source file="/var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk.config"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <target dev="sda" bus="sata"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     </disk>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <interface type="ethernet">
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <mac address="fa:16:3e:04:e4:55"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <driver name="vhost" rx_queue_size="512"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <mtu size="1442"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <target dev="tap429754cf-05"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     </interface>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <serial type="pty">
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <log file="/var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/console.log" append="off"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     </serial>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <video>
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <model type="virtio"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     </video>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <input type="tablet" bus="usb"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <rng model="virtio">
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <backend model="random">/dev/urandom</backend>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     </rng>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="pci" model="pcie-root-port"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <controller type="usb" index="0"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Jan 27 22:40:11 compute-1 nova_compute[183751]:       <stats period="10"/>
Jan 27 22:40:11 compute-1 nova_compute[183751]:     </memballoon>
Jan 27 22:40:11 compute-1 nova_compute[183751]:   </devices>
Jan 27 22:40:11 compute-1 nova_compute[183751]: </domain>
Jan 27 22:40:11 compute-1 nova_compute[183751]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.800 183755 DEBUG nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Preparing to wait for external event network-vif-plugged-429754cf-05d3-4257-8e96-2548d7d594e7 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.801 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.802 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.802 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.804 183755 DEBUG nova.virt.libvirt.vif [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2026-01-27T22:39:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1423487030',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-142348703',id=27,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6857f0c4294a43aab72cea9f0842f4c8',ramdisk_id='',reservation_id='r-te8pm4wy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-756468494',owner_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-756468494-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-27T22:40:05Z,user_data=None,user_id='b54ab6cd3d74475e9b38cfa8f4f224bf',uuid=199b6173-7807-45a0-9d03-8d9e1945a3ca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "429754cf-05d3-4257-8e96-2548d7d594e7", "address": "fa:16:3e:04:e4:55", "network": {"id": "3bb8ae35-374a-402a-86a2-14918e05f958", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-438128632-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef8b4fcea1b9482fbcc882f3383af9f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap429754cf-05", "ovs_interfaceid": "429754cf-05d3-4257-8e96-2548d7d594e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.805 183755 DEBUG nova.network.os_vif_util [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Converting VIF {"id": "429754cf-05d3-4257-8e96-2548d7d594e7", "address": "fa:16:3e:04:e4:55", "network": {"id": "3bb8ae35-374a-402a-86a2-14918e05f958", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-438128632-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef8b4fcea1b9482fbcc882f3383af9f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap429754cf-05", "ovs_interfaceid": "429754cf-05d3-4257-8e96-2548d7d594e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.806 183755 DEBUG nova.network.os_vif_util [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:e4:55,bridge_name='br-int',has_traffic_filtering=True,id=429754cf-05d3-4257-8e96-2548d7d594e7,network=Network(3bb8ae35-374a-402a-86a2-14918e05f958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap429754cf-05') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.807 183755 DEBUG os_vif [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:e4:55,bridge_name='br-int',has_traffic_filtering=True,id=429754cf-05d3-4257-8e96-2548d7d594e7,network=Network(3bb8ae35-374a-402a-86a2-14918e05f958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap429754cf-05') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.809 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.809 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.810 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.812 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.813 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f76c0bac-a0df-59c7-9081-1cbbd293d4a2', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.815 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.817 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.823 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.823 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap429754cf-05, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.824 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap429754cf-05, col_values=(('qos', UUID('ec8af993-f77b-4d43-83b1-9c02bf9acc59')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.825 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap429754cf-05, col_values=(('external_ids', {'iface-id': '429754cf-05d3-4257-8e96-2548d7d594e7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:e4:55', 'vm-uuid': '199b6173-7807-45a0-9d03-8d9e1945a3ca'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.827 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:11 compute-1 NetworkManager[56069]: <info>  [1769553611.8291] manager: (tap429754cf-05): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.830 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.834 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.835 183755 INFO os_vif [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:e4:55,bridge_name='br-int',has_traffic_filtering=True,id=429754cf-05d3-4257-8e96-2548d7d594e7,network=Network(3bb8ae35-374a-402a-86a2-14918e05f958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap429754cf-05')
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.943 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.945 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.992 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.993 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5853MB free_disk=73.13673400878906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.993 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:11 compute-1 nova_compute[183751]: 2026-01-27 22:40:11.993 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.044 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Instance 199b6173-7807-45a0-9d03-8d9e1945a3ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.045 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.045 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:40:11 up  2:42,  0 user,  load average: 0.34, 0.22, 0.16\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_6857f0c4294a43aab72cea9f0842f4c8': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.147 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.213 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.393 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.394 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.394 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] No VIF found with MAC fa:16:3e:04:e4:55, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.395 183755 INFO nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Using config drive
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.654 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:40:13 compute-1 nova_compute[183751]: 2026-01-27 22:40:13.909 183755 WARNING neutronclient.v2_0.client [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.108 183755 INFO nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Creating config drive at /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk.config
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.113 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpnda5xcdi execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.168 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.169 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.175s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.253 183755 DEBUG oslo_concurrency.processutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpnda5xcdi" returned: 0 in 0.140s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:40:14 compute-1 kernel: tap429754cf-05: entered promiscuous mode
Jan 27 22:40:14 compute-1 NetworkManager[56069]: <info>  [1769553614.3465] manager: (tap429754cf-05): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Jan 27 22:40:14 compute-1 ovn_controller[95969]: 2026-01-27T22:40:14Z|00125|binding|INFO|Claiming lport 429754cf-05d3-4257-8e96-2548d7d594e7 for this chassis.
Jan 27 22:40:14 compute-1 ovn_controller[95969]: 2026-01-27T22:40:14Z|00126|binding|INFO|429754cf-05d3-4257-8e96-2548d7d594e7: Claiming fa:16:3e:04:e4:55 10.100.0.13
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.348 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.361 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.371 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:e4:55 10.100.0.13'], port_security=['fa:16:3e:04:e4:55 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '199b6173-7807-45a0-9d03-8d9e1945a3ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3bb8ae35-374a-402a-86a2-14918e05f958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6857f0c4294a43aab72cea9f0842f4c8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '92693aa1-5f0b-4890-88fb-d4b31acc895c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ba2e925d-a604-49fc-8d28-9e325ef478cb, chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=429754cf-05d3-4257-8e96-2548d7d594e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.372 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 429754cf-05d3-4257-8e96-2548d7d594e7 in datapath 3bb8ae35-374a-402a-86a2-14918e05f958 bound to our chassis
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.375 105247 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3bb8ae35-374a-402a-86a2-14918e05f958
Jan 27 22:40:14 compute-1 systemd-udevd[221027]: Network interface NamePolicy= disabled on kernel command line.
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.398 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[ff0960b0-da25-472f-bf07-eb166ad404cc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.399 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3bb8ae35-31 in ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.402 212869 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3bb8ae35-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.402 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[51d11500-d2d4-494d-b6fd-c8a843e40af5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.403 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[1d1b0b2b-fc92-41c3-a168-99a2f8f2a176]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 NetworkManager[56069]: <info>  [1769553614.4051] device (tap429754cf-05): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 27 22:40:14 compute-1 systemd-machined[155034]: New machine qemu-10-instance-0000001b.
Jan 27 22:40:14 compute-1 NetworkManager[56069]: <info>  [1769553614.4064] device (tap429754cf-05): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.412 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:14 compute-1 ovn_controller[95969]: 2026-01-27T22:40:14Z|00127|binding|INFO|Setting lport 429754cf-05d3-4257-8e96-2548d7d594e7 ovn-installed in OVS
Jan 27 22:40:14 compute-1 ovn_controller[95969]: 2026-01-27T22:40:14Z|00128|binding|INFO|Setting lport 429754cf-05d3-4257-8e96-2548d7d594e7 up in Southbound
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.415 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:14 compute-1 systemd[1]: Started Virtual Machine qemu-10-instance-0000001b.
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.424 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[c8154b81-16f7-40d4-960b-20af5cbc5e59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.444 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[58d0eb20-66d1-499b-8ce5-2a998afb0db2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.491 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[42d7e8a4-35fc-4211-93b1-f7fd900809a0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 NetworkManager[56069]: <info>  [1769553614.4988] manager: (tap3bb8ae35-30): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.498 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5e802017-9c2c-4fd2-bec5-32c3c85884f7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.545 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[7f75798c-700e-4a81-911c-d5134220f4a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.548 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[fabfd0b2-a75e-455e-b42c-139cfee8c574]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 NetworkManager[56069]: <info>  [1769553614.5800] device (tap3bb8ae35-30): carrier: link connected
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.590 215469 DEBUG oslo.privsep.daemon [-] privsep: reply[35bb61cc-c72c-48c2-827f-17cff1d4e62f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.617 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[ce13d1d7-7e83-4bf1-9c33-751a3efbc03f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3bb8ae35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:ce:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 975684, 'reachable_time': 16820, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221061, 'error': None, 'target': 'ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.640 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[26c01196-f4e7-4694-b3b3-55f2697db03e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee5:ceec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 975684, 'tstamp': 975684}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221062, 'error': None, 'target': 'ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.664 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[77686230-0bce-4eef-9bfe-1bdfa20b967e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3bb8ae35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:ce:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 975684, 'reachable_time': 16820, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221063, 'error': None, 'target': 'ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.716 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[5048829a-7629-4186-90e5-1bbd0f90b06d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.831 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[84d3dd54-298a-47d2-a018-1999dcb791ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.833 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bb8ae35-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.833 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.834 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bb8ae35-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:40:14 compute-1 NetworkManager[56069]: <info>  [1769553614.8375] manager: (tap3bb8ae35-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 27 22:40:14 compute-1 kernel: tap3bb8ae35-30: entered promiscuous mode
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.836 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.840 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3bb8ae35-30, col_values=(('external_ids', {'iface-id': 'b1c3a04c-9b21-40e0-b1e1-f0bdc41a2ba5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:40:14 compute-1 ovn_controller[95969]: 2026-01-27T22:40:14Z|00129|binding|INFO|Releasing lport b1c3a04c-9b21-40e0-b1e1-f0bdc41a2ba5 from this chassis (sb_readonly=0)
Jan 27 22:40:14 compute-1 nova_compute[183751]: 2026-01-27 22:40:14.865 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.868 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff0522e-bbff-4635-ae0c-e8ef7c715ba0]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.869 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.869 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.870 105247 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 3bb8ae35-374a-402a-86a2-14918e05f958 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.870 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.871 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ec5ba6-b10d-4f2e-9063-de05a4fc8d67]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.871 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.872 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[141ffcee-8098-46b2-b3e0-aac61e2eab5a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.873 105247 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: global
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     log         /dev/log local0 debug
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     log-tag     haproxy-metadata-proxy-3bb8ae35-374a-402a-86a2-14918e05f958
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     user        root
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     group       root
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     maxconn     1024
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     pidfile     /var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     daemon
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: defaults
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     log global
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     mode http
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     option httplog
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     option dontlognull
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     option http-server-close
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     option forwardfor
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     retries                 3
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     timeout http-request    30s
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     timeout connect         30s
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     timeout client          32s
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     timeout server          32s
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     timeout http-keep-alive 30s
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: listen listener
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     bind 169.254.169.254:80
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     server metadata /var/lib/neutron/metadata_proxy
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:     http-request add-header X-OVN-Network-ID 3bb8ae35-374a-402a-86a2-14918e05f958
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Jan 27 22:40:14 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:40:14.874 105247 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958', 'env', 'PROCESS_TAG=haproxy-3bb8ae35-374a-402a-86a2-14918e05f958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3bb8ae35-374a-402a-86a2-14918e05f958.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.210 183755 DEBUG nova.compute.manager [req-18126710-508b-440b-a3c9-617d8c33e07f req-484376ad-8ca6-45fb-8812-b83d485dd519 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Received event network-vif-plugged-429754cf-05d3-4257-8e96-2548d7d594e7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.210 183755 DEBUG oslo_concurrency.lockutils [req-18126710-508b-440b-a3c9-617d8c33e07f req-484376ad-8ca6-45fb-8812-b83d485dd519 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.211 183755 DEBUG oslo_concurrency.lockutils [req-18126710-508b-440b-a3c9-617d8c33e07f req-484376ad-8ca6-45fb-8812-b83d485dd519 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.211 183755 DEBUG oslo_concurrency.lockutils [req-18126710-508b-440b-a3c9-617d8c33e07f req-484376ad-8ca6-45fb-8812-b83d485dd519 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.212 183755 DEBUG nova.compute.manager [req-18126710-508b-440b-a3c9-617d8c33e07f req-484376ad-8ca6-45fb-8812-b83d485dd519 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Processing event network-vif-plugged-429754cf-05d3-4257-8e96-2548d7d594e7 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.213 183755 DEBUG nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.217 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.222 183755 INFO nova.virt.libvirt.driver [-] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Instance spawned successfully.
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.223 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Jan 27 22:40:15 compute-1 podman[221102]: 2026-01-27 22:40:15.338858691 +0000 UTC m=+0.052438787 container create 3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Jan 27 22:40:15 compute-1 systemd[1]: Started libpod-conmon-3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006.scope.
Jan 27 22:40:15 compute-1 systemd[1]: Started libcrun container.
Jan 27 22:40:15 compute-1 podman[221102]: 2026-01-27 22:40:15.309832454 +0000 UTC m=+0.023412600 image pull cac409e9f8a54678134b7335a1be1097c580cd10e08ab7acb278074c520a05f6 38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Jan 27 22:40:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4a86280f9c5b9e9d162d6ab14f4d000b7b1a09ad21897d3c0eb710e738083a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 27 22:40:15 compute-1 podman[221102]: 2026-01-27 22:40:15.426117896 +0000 UTC m=+0.139697992 container init 3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 27 22:40:15 compute-1 podman[221115]: 2026-01-27 22:40:15.437171449 +0000 UTC m=+0.065939590 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 27 22:40:15 compute-1 podman[221102]: 2026-01-27 22:40:15.446589062 +0000 UTC m=+0.160169158 container start 3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958, org.label-schema.build-date=20260126, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Jan 27 22:40:15 compute-1 neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958[221118]: [NOTICE]   (221143) : New worker (221146) forked
Jan 27 22:40:15 compute-1 neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958[221118]: [NOTICE]   (221143) : Loading success.
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.740 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.743 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.743 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.744 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.745 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:40:15 compute-1 nova_compute[183751]: 2026-01-27 22:40:15.746 183755 DEBUG nova.virt.libvirt.driver [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Jan 27 22:40:16 compute-1 nova_compute[183751]: 2026-01-27 22:40:16.169 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:40:16 compute-1 nova_compute[183751]: 2026-01-27 22:40:16.170 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:40:16 compute-1 nova_compute[183751]: 2026-01-27 22:40:16.170 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:40:16 compute-1 nova_compute[183751]: 2026-01-27 22:40:16.170 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:40:16 compute-1 nova_compute[183751]: 2026-01-27 22:40:16.257 183755 INFO nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Took 9.95 seconds to spawn the instance on the hypervisor.
Jan 27 22:40:16 compute-1 nova_compute[183751]: 2026-01-27 22:40:16.257 183755 DEBUG nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Jan 27 22:40:16 compute-1 nova_compute[183751]: 2026-01-27 22:40:16.798 183755 INFO nova.compute.manager [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Took 15.24 seconds to build instance.
Jan 27 22:40:16 compute-1 nova_compute[183751]: 2026-01-27 22:40:16.829 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:17 compute-1 nova_compute[183751]: 2026-01-27 22:40:17.271 183755 DEBUG nova.compute.manager [req-fa384265-f3cf-4ab1-a641-2ad7767b035d req-86dc2bba-0e86-45a6-861c-81cb99e8ffef 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Received event network-vif-plugged-429754cf-05d3-4257-8e96-2548d7d594e7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:40:17 compute-1 nova_compute[183751]: 2026-01-27 22:40:17.271 183755 DEBUG oslo_concurrency.lockutils [req-fa384265-f3cf-4ab1-a641-2ad7767b035d req-86dc2bba-0e86-45a6-861c-81cb99e8ffef 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:17 compute-1 nova_compute[183751]: 2026-01-27 22:40:17.272 183755 DEBUG oslo_concurrency.lockutils [req-fa384265-f3cf-4ab1-a641-2ad7767b035d req-86dc2bba-0e86-45a6-861c-81cb99e8ffef 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:17 compute-1 nova_compute[183751]: 2026-01-27 22:40:17.272 183755 DEBUG oslo_concurrency.lockutils [req-fa384265-f3cf-4ab1-a641-2ad7767b035d req-86dc2bba-0e86-45a6-861c-81cb99e8ffef 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:17 compute-1 nova_compute[183751]: 2026-01-27 22:40:17.272 183755 DEBUG nova.compute.manager [req-fa384265-f3cf-4ab1-a641-2ad7767b035d req-86dc2bba-0e86-45a6-861c-81cb99e8ffef 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] No waiting events found dispatching network-vif-plugged-429754cf-05d3-4257-8e96-2548d7d594e7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:40:17 compute-1 nova_compute[183751]: 2026-01-27 22:40:17.272 183755 WARNING nova.compute.manager [req-fa384265-f3cf-4ab1-a641-2ad7767b035d req-86dc2bba-0e86-45a6-861c-81cb99e8ffef 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Received unexpected event network-vif-plugged-429754cf-05d3-4257-8e96-2548d7d594e7 for instance with vm_state active and task_state None.
Jan 27 22:40:17 compute-1 nova_compute[183751]: 2026-01-27 22:40:17.304 183755 DEBUG oslo_concurrency.lockutils [None req-abfab2d2-c748-4d1c-9eaf-2848c7ab98bf b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.761s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:18 compute-1 nova_compute[183751]: 2026-01-27 22:40:18.254 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:19 compute-1 openstack_network_exporter[195945]: ERROR   22:40:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:40:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:40:19 compute-1 openstack_network_exporter[195945]: ERROR   22:40:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:40:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:40:20 compute-1 nova_compute[183751]: 2026-01-27 22:40:20.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:40:21 compute-1 nova_compute[183751]: 2026-01-27 22:40:21.867 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:23 compute-1 nova_compute[183751]: 2026-01-27 22:40:23.317 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:24 compute-1 nova_compute[183751]: 2026-01-27 22:40:24.673 183755 DEBUG oslo_concurrency.lockutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:24 compute-1 nova_compute[183751]: 2026-01-27 22:40:24.677 183755 DEBUG oslo_concurrency.lockutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:25 compute-1 nova_compute[183751]: 2026-01-27 22:40:25.189 183755 DEBUG nova.objects.instance [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lazy-loading 'flavor' on Instance uuid 199b6173-7807-45a0-9d03-8d9e1945a3ca obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:40:26 compute-1 nova_compute[183751]: 2026-01-27 22:40:26.204 183755 DEBUG oslo_concurrency.lockutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 1.527s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:26 compute-1 nova_compute[183751]: 2026-01-27 22:40:26.872 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:27 compute-1 nova_compute[183751]: 2026-01-27 22:40:27.564 183755 DEBUG oslo_concurrency.lockutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:27 compute-1 nova_compute[183751]: 2026-01-27 22:40:27.565 183755 DEBUG oslo_concurrency.lockutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:27 compute-1 nova_compute[183751]: 2026-01-27 22:40:27.565 183755 INFO nova.compute.manager [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Attaching volume d3859eb7-9382-438b-ae1c-020c78adf571 to /dev/vdb
Jan 27 22:40:27 compute-1 nova_compute[183751]: 2026-01-27 22:40:27.567 183755 DEBUG nova.objects.instance [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lazy-loading 'flavor' on Instance uuid 199b6173-7807-45a0-9d03-8d9e1945a3ca obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:40:27 compute-1 ovn_controller[95969]: 2026-01-27T22:40:27Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:04:e4:55 10.100.0.13
Jan 27 22:40:27 compute-1 ovn_controller[95969]: 2026-01-27T22:40:27Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:e4:55 10.100.0.13
Jan 27 22:40:28 compute-1 nova_compute[183751]: 2026-01-27 22:40:28.295 183755 DEBUG os_brick.utils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.101', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-1.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:177
Jan 27 22:40:28 compute-1 nova_compute[183751]: 2026-01-27 22:40:28.296 183755 INFO oslo.privsep.daemon [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmp8js46_cx/privsep.sock']
Jan 27 22:40:28 compute-1 nova_compute[183751]: 2026-01-27 22:40:28.321 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.031 183755 INFO oslo.privsep.daemon [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Spawned new privsep daemon via rootwrap
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:28.892 221170 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:28.897 221170 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:28.899 221170 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_READ_SEARCH|CAP_SYS_ADMIN/CAP_DAC_READ_SEARCH|CAP_SYS_ADMIN/none
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:28.899 221170 INFO oslo.privsep.daemon [-] privsep daemon running as pid 221170
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.035 221170 DEBUG oslo.privsep.daemon [-] privsep: reply[3b97c958-af07-4ba8-a8af-bb05867e2650]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.126 221170 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.138 221170 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.138 221170 DEBUG oslo.privsep.daemon [-] privsep: reply[300e4964-80b8-4980-8dc8-31260afe130a]: (4, ('InitiatorName=iqn.1994-05.com.redhat:643c297d4e7f', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.141 221170 DEBUG oslo.privsep.daemon [-] privsep: Exception during request[cb66d4d8-a654-4eb4-a215-659a063746a3]: [Errno 2] No such file or directory: '/dev/scini' _process_cmd /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:492
Jan 27 22:40:29 compute-1 nova_compute[183751]: Traceback (most recent call last):
Jan 27 22:40:29 compute-1 nova_compute[183751]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd
Jan 27 22:40:29 compute-1 nova_compute[183751]:     ret = func(*f_args, **f_kwargs)
Jan 27 22:40:29 compute-1 nova_compute[183751]:           ^^^^^^^^^^^^^^^^^^^^^^^^^
Jan 27 22:40:29 compute-1 nova_compute[183751]:   File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap
Jan 27 22:40:29 compute-1 nova_compute[183751]:     return func(*args, **kwargs)
Jan 27 22:40:29 compute-1 nova_compute[183751]:            ^^^^^^^^^^^^^^^^^^^^^
Jan 27 22:40:29 compute-1 nova_compute[183751]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid
Jan 27 22:40:29 compute-1 nova_compute[183751]:     with open_scini_device() as fd:
Jan 27 22:40:29 compute-1 nova_compute[183751]:          ^^^^^^^^^^^^^^^^^^^
Jan 27 22:40:29 compute-1 nova_compute[183751]:   File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__
Jan 27 22:40:29 compute-1 nova_compute[183751]:     return next(self.gen)
Jan 27 22:40:29 compute-1 nova_compute[183751]:            ^^^^^^^^^^^^^^
Jan 27 22:40:29 compute-1 nova_compute[183751]:   File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device
Jan 27 22:40:29 compute-1 nova_compute[183751]:     fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)
Jan 27 22:40:29 compute-1 nova_compute[183751]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Jan 27 22:40:29 compute-1 nova_compute[183751]: FileNotFoundError: [Errno 2] No such file or directory: '/dev/scini'
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.142 221170 DEBUG oslo.privsep.daemon [-] privsep: reply[cb66d4d8-a654-4eb4-a215-659a063746a3]: (5, 'builtins.FileNotFoundError', (2, 'No such file or directory'), 'Traceback (most recent call last):\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/daemon.py", line 489, in _process_cmd\n    ret = func(*f_args, **f_kwargs)\n          ^^^^^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py", line 270, in _wrap\n    return func(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 57, in get_guid\n    with open_scini_device() as fd:\n         ^^^^^^^^^^^^^^^^^^^\n  File "/usr/lib64/python3.12/contextlib.py", line 137, in __enter__\n    return next(self.gen)\n           ^^^^^^^^^^^^^^\n  File "/usr/lib/python3.12/site-packages/os_brick/privileged/scaleio.py", line 40, in open_scini_device\n    fd = os.open(SCINI_DEVICE_PATH, os.O_RDWR)\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\nFileNotFoundError: [Errno 2] No such file or directory: \'/dev/scini\'\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.143 183755 ERROR os_brick.initiator.connectors.scaleio [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Error querying sdc guid: [Errno 2] No such file or directory: FileNotFoundError: [Errno 2] No such file or directory
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.143 183755 INFO os_brick.initiator.connectors.scaleio [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Unable to find SDC guid: Error querying sdc guid: [Errno 2] No such file or directory
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.144 221170 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.157 221170 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.157 221170 DEBUG oslo.privsep.daemon [-] privsep: reply[1c30be46-e290-4d43-a9c0-0f92a2098b0e]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.159 221170 DEBUG oslo.privsep.daemon [-] privsep: reply[bea2cd05-a12d-4b3e-be39-dfc0a79928bb]: (4, '3b9a1f76-d315-49d8-90b4-a523eb6cf5fa') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.159 183755 DEBUG oslo_concurrency.processutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.187 183755 DEBUG oslo_concurrency.processutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.192 183755 DEBUG os_brick.initiator.connectors.lightos [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:132
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.204 183755 INFO os_brick.initiator.connectors.lightos [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Current host hostNQN nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d and IP(s) are ['38.102.83.110', '192.168.122.101', '172.19.0.101', '172.18.0.101', '172.17.0.101', 'fe80::e45a:acff:fe3d:c06b', 'fe80::fc16:3eff:fe04:e455', 'fe80::7408:49ff:fe7f:6c87'] 
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.205 183755 DEBUG os_brick.initiator.connectors.lightos [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:109
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.205 183755 DEBUG os_brick.initiator.connectors.lightos [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.12/site-packages/os_brick/initiator/connectors/lightos.py:112
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.206 183755 DEBUG os_brick.utils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] <== get_connector_properties: return (911ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.101', 'host': 'compute-1.ctlplane.example.com', 'multipath': True, 'enforce_multipath': True, 'initiator': 'iqn.1994-05.com.redhat:643c297d4e7f', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3b9a1f76-d315-49d8-90b4-a523eb6cf5fa', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': '', 'host_ips': ['38.102.83.110', '192.168.122.101', '172.19.0.101', '172.18.0.101', '172.17.0.101', 'fe80::e45a:acff:fe3d:c06b', 'fe80::fc16:3eff:fe04:e455', 'fe80::7408:49ff:fe7f:6c87']} trace_logging_wrapper /usr/lib/python3.12/site-packages/os_brick/utils.py:204
Jan 27 22:40:29 compute-1 nova_compute[183751]: 2026-01-27 22:40:29.207 183755 DEBUG nova.virt.block_device [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Updating existing volume attachment record: bb14323f-de2c-40f5-ac15-2cea04845103 _volume_attach /usr/lib/python3.12/site-packages/nova/virt/block_device.py:666
Jan 27 22:40:30 compute-1 nova_compute[183751]: 2026-01-27 22:40:30.961 183755 DEBUG oslo_concurrency.lockutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:30 compute-1 nova_compute[183751]: 2026-01-27 22:40:30.962 183755 DEBUG oslo_concurrency.lockutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:30 compute-1 nova_compute[183751]: 2026-01-27 22:40:30.962 183755 DEBUG oslo_concurrency.lockutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:30 compute-1 nova_compute[183751]: 2026-01-27 22:40:30.963 183755 DEBUG nova.virt.libvirt.volume.mount [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Got _HostMountState generation 0 get_state /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:91
Jan 27 22:40:30 compute-1 nova_compute[183751]: 2026-01-27 22:40:30.963 183755 DEBUG nova.virt.libvirt.volume.mount [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] _HostMountState.mount(fstype=nfs, export=172.18.0.100:/data/cinder_backend_1, vol_name=volume-d3859eb7-9382-438b-ae1c-020c78adf571, /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540, options=[]) generation 0 mount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:288
Jan 27 22:40:30 compute-1 nova_compute[183751]: 2026-01-27 22:40:30.963 183755 DEBUG nova.virt.libvirt.volume.mount [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Mounting /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540 generation 0 mount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:301
Jan 27 22:40:31 compute-1 kernel: FS-Cache: Loaded
Jan 27 22:40:31 compute-1 kernel: Key type dns_resolver registered
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: ====                        Guru Meditation                         ====
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: ====                            Package                             ====
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: product = OpenStack Compute
Jan 27 22:40:31 compute-1 nova_compute[183751]: vendor = RDO
Jan 27 22:40:31 compute-1 nova_compute[183751]: version = 32.1.0-0.20251105112212.710ffbb.el10
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: ====                            Threads                             ====
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139715730720448                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:214 in _native_thread
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `libvirt.virEventRunDefaultImpl()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/site-packages/libvirt.py:441 in virEventRunDefaultImpl
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `ret = libvirtmod.virEventRunDefaultImpl()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139715739113152                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139715747505856                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139715755898560                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716242413248                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716250805952                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716259198656                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716267591360                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716275984064                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716284376768                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716292769472                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716779284160                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716787676864                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716796069568                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716804462272                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716812854976                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716821247680                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139716829640384                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139717042927296                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139717051336384                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139717059745472                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:74 in tworker
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `msg = _reqq.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/queue.py:171 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.not_empty.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                  Thread #139717222203008                   ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_reports/guru_meditation_report.py:178 in _handler
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `cls.handle_signal(version, service_name, log_dir, None)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_reports/guru_meditation_report.py:217 in handle_signal
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `res = cls(version, frame).run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_reports/guru_meditation_report.py:266 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return super().run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:76 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return "\n".join(str(sect) for sect in self.sections)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:76 in <genexpr>
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return "\n".join(str(sect) for sect in self.sections)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:101 in __str__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.view(self.generator())`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_reports/report.py:130 in newgen
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `res = gen()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_reports/generators/threading.py:67 in __call__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `thread_id: tm.ThreadModel(thread_id, stack)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: ====                         Green Threads                          ====
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/bin/nova-compute:8 in <module>
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `sys.exit(main())`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/cmd/compute.py:62 in main
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `service.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/service.py:335 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `_launcher.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:300 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `status, signo = self._wait_for_exit_or_signal()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:278 in _wait_for_exit_or_signal
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `super().wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:213 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.services.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:690 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.tg.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:368 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._wait_threads()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:343 in _wait_threads
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._perform_action_on_threads(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:270 in _perform_action_on_threads
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `action_func(x)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:344 in <lambda>
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `lambda x: x.wait(),`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/threadgroup.py:63 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.thread.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:232 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self._exit_event.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:577 in poll
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.conn.consume(timeout=current_timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1477 in consume
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.ensure(_consume,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1173 in ensure
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `ret, channel = autoretry_method()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/kombu/connection.py:556 in _ensured
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return fun(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/kombu/connection.py:639 in __call__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return fun(*args, channel=channels[0], **kwargs), channels[0]`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1162 in execute_method
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `method()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1464 in _consume
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.connection.drain_events(timeout=poll_timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/kombu/connection.py:341 in drain_events
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.transport.drain_events(self.connection, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/kombu/transport/pyamqp.py:171 in drain_events
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return connection.drain_events(**kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/amqp/connection.py:526 in drain_events
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `while not self.blocking_read(timeout):`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/amqp/connection.py:531 in blocking_read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `frame = self.transport.read_frame()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/amqp/transport.py:294 in read_frame
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `frame_header = read(7, True)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/amqp/transport.py:574 in _read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `s = recv(n - len(rbuf))  # see note above`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:196 in read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self._call_trampolining(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:169 in _call_trampolining
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `trampoline(self,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._heartbeat_exit_event.wait(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:655 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `signaled = self._cond.wait(timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:359 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `gotit = waiter.acquire(True, timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `hubs.get_hub().switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._heartbeat_exit_event.wait(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:655 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `signaled = self._cond.wait(timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:359 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `gotit = waiter.acquire(True, timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `hubs.get_hub().switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1393 in _heartbeat_thread_job
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._heartbeat_exit_event.wait(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:655 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `signaled = self._cond.wait(timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:359 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `gotit = waiter.acquire(True, timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:107 in acquire
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `hubs.get_hub().switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:154 in _reader_main
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `for msg in reader:`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:91 in __next__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `buf = self.readsock.recv(4096)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._read_trampoline()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._trampoline(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:154 in _reader_main
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `for msg in reader:`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:91 in __next__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `buf = self.readsock.recv(4096)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._read_trampoline()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._trampoline(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:154 in _reader_main
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `for msg in reader:`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:91 in __next__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `buf = self.readsock.recv(4096)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._read_trampoline()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._trampoline(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:267 in logger
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `for line in f:`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:105 in readinto
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `data = self.read(up_to)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return _original_os.read(self._fileno, size)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `hubs.trampoline(fd, read=True)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:267 in logger
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `for line in f:`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:105 in readinto
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `data = self.read(up_to)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return _original_os.read(self._fileno, size)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `hubs.trampoline(fd, read=True)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:267 in logger
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `for line in f:`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:105 in readinto
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `data = self.read(up_to)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return _original_os.read(self._fileno, size)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `hubs.trampoline(fd, read=True)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 kernel: NFS: Registering the id_resolver key type
Jan 27 22:40:31 compute-1 kernel: Key type id_resolver registered
Jan 27 22:40:31 compute-1 kernel: Key type id_legacy registered
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_utils/excutils.py:257 in wrapper
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return infunc(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/base.py:294 in _runner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `incoming = self._poll_style_listener.poll(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/base.py:42 in wrapper
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `message = func(in_self, timeout=watch.leftover(True))`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py:429 in poll
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.conn.consume(timeout=min(self._current_timeout, left))`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1477 in consume
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.ensure(_consume,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1173 in ensure
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `ret, channel = autoretry_method()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/kombu/connection.py:556 in _ensured
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return fun(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/kombu/connection.py:639 in __call__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return fun(*args, channel=channels[0], **kwargs), channels[0]`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1162 in execute_method
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `method()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/_drivers/impl_rabbit.py:1464 in _consume
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.connection.drain_events(timeout=poll_timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/kombu/connection.py:341 in drain_events
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.transport.drain_events(self.connection, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/kombu/transport/pyamqp.py:171 in drain_events
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return connection.drain_events(**kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/amqp/connection.py:526 in drain_events
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `while not self.blocking_read(timeout):`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/amqp/connection.py:531 in blocking_read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `frame = self.transport.read_frame()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/amqp/transport.py:294 in read_frame
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `frame_header = read(7, True)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/amqp/transport.py:574 in _read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `s = recv(n - len(rbuf))  # see note above`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:196 in read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self._call_trampolining(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/ssl.py:169 in _call_trampolining
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `trampoline(self,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:48 in __thread_body
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1032 in _bootstrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/thread.py:100 in wrap_bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bootstrap_inner()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1075 in _bootstrap_inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:1012 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._target(*self._args, **self._kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/connection.py:108 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.poller.block()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/site-packages/ovs/poller.py:231 in block
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `events = self.poll.poll(self.timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/site-packages/ovs/poller.py:137 in poll
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `rlist, wlist, xlist = select.select(self.rlist,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/select.py:80 in select
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenpool.py:87 in _spawn_n_impl
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/futurist/_green.py:69 in __call__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.work.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/futurist/_utils.py:45 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = self.fn(*self.args, **self.kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/utils.py:584 in context_wrapper
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:225 in _dispatch_thread
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._dispatch_events()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:393 in _dispatch_events
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `_c = self._event_notify_recv.read(1)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/py3.py:84 in read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return _original_os.read(self._fileno, size)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/green/os.py:47 in read
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `hubs.trampoline(fd, read=True)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenpool.py:87 in _spawn_n_impl
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/futurist/_green.py:69 in __call__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.work.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/futurist/_utils.py:45 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = self.fn(*self.args, **self.kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/utils.py:584 in context_wrapper
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:233 in _conn_event_thread
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._dispatch_conn_event()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:239 in _dispatch_conn_event
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `handler = self._conn_event_handler_queue.get()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/queue.py:321 in get
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return waiter.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/queue.py:140 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return get_hub().switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenpool.py:87 in _spawn_n_impl
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `func(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/futurist/_green.py:69 in __call__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.work.run()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/futurist/_utils.py:45 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = self.fn(*self.args, **self.kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py:174 in _process_incoming
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `res = self.dispatcher.dispatch(message)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py:309 in dispatch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self._do_dispatch(endpoint, method, ctxt, args)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py:229 in _do_dispatch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = func(ctxt, **new_args)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/exception_wrapper.py:63 in wrapped
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return f(self, context, *args, **kw)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/compute/utils.py:1483 in decorated_function
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return function(self, context, *args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:203 in decorated_function
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return function(self, context, *args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:8098 in attach_volume
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `do_attach_volume(context, instance, driver_bdm)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:415 in inner
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return f(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:8093 in do_attach_volume
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self._attach_volume(context, instance, driver_bdm)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/compute/manager.py:8112 in _attach_volume
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `bdm.attach(context, instance, self.volume_api, self.driver,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:46 in wrapped
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `ret_val = method(obj, context, *args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:769 in attach
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._do_attach(context, instance, volume, volume_api,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:754 in _do_attach
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._volume_attach(context, volume, connector, instance,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/block_device.py:692 in _volume_attach
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `virt_driver.attach_volume(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2293 in attach_volume
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._connect_volume(context, connection_info, instance,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2041 in _connect_volume
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `vol_driver.connect_volume(connection_info, instance)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/fs.py:113 in connect_volume
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `mount.mount(self.fstype, export, vol_name, mountpoint, instance,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:414 in mount
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `mount_state.mount(fstype, export, vol_name, mountpoint, instance,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:308 in mount
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `nova.privsep.fs.mount(fstype, export, mountpoint, options)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/priv_context.py:267 in _wrap
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.channel.remote_call(name, args, kwargs,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:213 in remote_call
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = self.send_recv((comm.Message.CALL.value, name, args, kwargs),`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:194 in send_recv
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `reply = future.result()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_privsep/comm.py:121 in result
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `if not self.condvar.wait(timeout=self.timeout):`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib64/python3.12/threading.py:355 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `waiter.acquire()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/semaphore.py:115 in acquire
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `hubs.get_hub().switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:272 in main
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = function(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:161 in _run_loop
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._sleep(idle)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:109 in _sleep
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._abort.wait(timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_utils/eventletutils.py:178 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `event.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:272 in main
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = function(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:161 in _run_loop
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._sleep(idle)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:109 in _sleep
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._abort.wait(timeout)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_utils/eventletutils.py:178 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `event.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenthread.py:272 in main
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = function(*args, **kwargs)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:725 in run_service
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `done.wait()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/event.py:124 in wait
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `result = hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:352 in run
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self.fire_timers(self.clock())`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:471 in fire_timers
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `timer()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/timer.py:59 in __call__
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `cb(*args, **kw)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/tpool.py:56 in tpool_trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `_c = _rsock.recv(1)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:352 in recv
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self._recv_loop(self.fd.recv, b'', bufsize, flags)`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:346 in _recv_loop
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._read_trampoline()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:314 in _read_trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `self._trampoline(`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/greenio/base.py:206 in _trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return trampoline(fd, read=read, write=write, timeout=timeout,`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/__init__.py:157 in trampoline
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return hub.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: /usr/lib/python3.12/site-packages/eventlet/hubs/hub.py:310 in switch
Jan 27 22:40:31 compute-1 nova_compute[183751]:     `return self.greenlet.switch()`
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: No Traceback!
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ------                        Green Thread                        ------
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: No Traceback!
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: ====                           Processes                            ====
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: Process 183755 (under 183753) [ run by: nova (42436), state: running ]
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: ====                         Configuration                          ====
Jan 27 22:40:31 compute-1 nova_compute[183751]: ========================================================================
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: api: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   compute_link_prefix = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01
Jan 27 22:40:31 compute-1 nova_compute[183751]:   dhcp_domain = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enable_instance_password = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   glance_link_prefix = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instance_list_cells_batch_fixed_size = 100
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instance_list_cells_batch_strategy = distributed
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instance_list_per_project_cells = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   list_records_by_skipping_down_cells = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   local_metadata_per_cell = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_limit = 1000
Jan 27 22:40:31 compute-1 nova_compute[183751]:   metadata_cache_expiration = 15
Jan 27 22:40:31 compute-1 nova_compute[183751]:   neutron_default_project_id = default
Jan 27 22:40:31 compute-1 nova_compute[183751]:   response_validation = warn
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use_neutron_default_nets = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vendordata_dynamic_connect_timeout = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vendordata_dynamic_failure_fatal = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vendordata_dynamic_read_timeout = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vendordata_dynamic_ssl_certfile = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vendordata_dynamic_targets = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vendordata_jsonfile_path = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vendordata_providers = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     StaticJSON
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: api_database: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   asyncio_connection = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   asyncio_slave_connection = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   backend = sqlalchemy
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection_debug = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection_parameters = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection_recycle_time = 3600
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection_trace = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   db_inc_retry_interval = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   db_max_retries = 20
Jan 27 22:40:31 compute-1 nova_compute[183751]:   db_max_retry_interval = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   db_retry_interval = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_overflow = 50
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_pool_size = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_retries = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   mysql_sql_mode = TRADITIONAL
Jan 27 22:40:31 compute-1 nova_compute[183751]:   mysql_wsrep_sync_wait = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   pool_timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retry_interval = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   slave_connection = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   sqlite_synchronous = True
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: barbican: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_endpoint = http://localhost/identity/v3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   barbican_api_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   barbican_endpoint = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   barbican_endpoint_type = internal
Jan 27 22:40:31 compute-1 nova_compute[183751]:   barbican_region_name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   number_of_retries = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retry_delay = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   send_service_user_token = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   verify_ssl = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   verify_ssl_path = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: barbican_service_user: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_section = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_type = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: cache: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   backend = oslo_cache.dict
Jan 27 22:40:31 compute-1 nova_compute[183751]:   backend_argument = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   backend_expiration_time = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   config_prefix = cache.oslo
Jan 27 22:40:31 compute-1 nova_compute[183751]:   dead_timeout = 60.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   debug_cache_backend = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enable_retry_client = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enable_socket_keepalive = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enabled = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enforce_fips_mode = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   expiration_time = 600
Jan 27 22:40:31 compute-1 nova_compute[183751]:   hashclient_retry_attempts = 2
Jan 27 22:40:31 compute-1 nova_compute[183751]:   hashclient_retry_delay = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   memcache_dead_retry = 300
Jan 27 22:40:31 compute-1 nova_compute[183751]:   memcache_password = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   memcache_pool_connection_get_timeout = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   memcache_pool_flush_on_reconnect = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   memcache_pool_maxsize = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   memcache_pool_unused_timeout = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   memcache_sasl_enabled = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   memcache_servers = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     localhost:11211
Jan 27 22:40:31 compute-1 nova_compute[183751]:   memcache_socket_timeout = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   memcache_username = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   proxies = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   redis_db = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   redis_password = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   redis_sentinel_service_name = mymaster
Jan 27 22:40:31 compute-1 nova_compute[183751]:   redis_sentinels = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     localhost:26379
Jan 27 22:40:31 compute-1 nova_compute[183751]:   redis_server = localhost:6379
Jan 27 22:40:31 compute-1 nova_compute[183751]:   redis_socket_timeout = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   redis_username = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retry_attempts = 2
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retry_delay = 0.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   socket_keepalive_count = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   socket_keepalive_idle = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   socket_keepalive_interval = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   tls_allowed_ciphers = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   tls_cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   tls_certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   tls_enabled = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   tls_keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: cinder: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_section = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_type = password
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   catalog_info = volumev3:cinderv3:internalURL
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cross_az_attach = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   debug = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint_template = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   http_retries = 3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   os_region_name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: compute: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   consecutive_build_service_disable_threshold = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_dedicated_set = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_shared_set = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   image_type_exclude_list = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_wait_for_vif_plug = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_concurrent_disk_ops = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_disk_devices_to_attach = -1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   packing_host_numa_cells_allocation_strategy = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   provider_config_location = /etc/nova/provider_config/
Jan 27 22:40:31 compute-1 nova_compute[183751]:   resource_provider_association_refresh = 300
Jan 27 22:40:31 compute-1 nova_compute[183751]:   sharing_providers_max_uuids_per_request = 200
Jan 27 22:40:31 compute-1 nova_compute[183751]:   shutdown_retry_interval = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vmdk_allowed_types = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     monolithicSparse
Jan 27 22:40:31 compute-1 nova_compute[183751]:     streamOptimized
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: conductor: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   workers = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: console: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   allowed_origins = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ssl_ciphers = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ssl_minimum_version = default
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: consoleauth: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enforce_session_timeout = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   token_ttl = 600
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: cyborg: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint-override = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   min_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   region-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retriable-status-codes = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-type = accelerator
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   valid-interfaces = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     internal
Jan 27 22:40:31 compute-1 nova_compute[183751]:     public
Jan 27 22:40:31 compute-1 nova_compute[183751]:   version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: database: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   asyncio_connection = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   asyncio_slave_connection = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   backend = sqlalchemy
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection_debug = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection_parameters = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection_recycle_time = 3600
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection_trace = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   db_inc_retry_interval = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   db_max_retries = 20
Jan 27 22:40:31 compute-1 nova_compute[183751]:   db_max_retry_interval = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   db_retry_interval = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_overflow = 50
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_pool_size = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_retries = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   mysql_sql_mode = TRADITIONAL
Jan 27 22:40:31 compute-1 nova_compute[183751]:   mysql_wsrep_sync_wait = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   pool_timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retry_interval = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   slave_connection = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   sqlite_synchronous = True
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: default: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   allow_resize_to_same_host = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   arq_binding_timeout = 300
Jan 27 22:40:31 compute-1 nova_compute[183751]:   backdoor_port = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   backdoor_socket = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   block_device_allocate_retries = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   block_device_allocate_retries_interval = 3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cell_worker_thread_pool_size = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cert = self.pem
Jan 27 22:40:31 compute-1 nova_compute[183751]:   compute_driver = libvirt.LibvirtDriver
Jan 27 22:40:31 compute-1 nova_compute[183751]:   compute_monitors = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   config-dir = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     /etc/nova/nova.conf.d
Jan 27 22:40:31 compute-1 nova_compute[183751]:   config-file = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     /etc/nova/nova-compute.conf
Jan 27 22:40:31 compute-1 nova_compute[183751]:     /etc/nova/nova.conf
Jan 27 22:40:31 compute-1 nova_compute[183751]:   config_drive_format = iso9660
Jan 27 22:40:31 compute-1 nova_compute[183751]:   config_source = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   console_host = compute-1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   control_exchange = nova
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_allocation_ratio = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   daemon = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   debug = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_access_ip_network_name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_availability_zone = nova
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_ephemeral_format = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_green_pool_size = 1000
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_log_levels = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     amqp=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     amqplib=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     boto=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     dogpile.core.dogpile=INFO
Jan 27 22:40:31 compute-1 nova_compute[183751]:     glanceclient=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     iso8601=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     keystoneauth=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     keystonemiddleware=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     oslo.cache=INFO
Jan 27 22:40:31 compute-1 nova_compute[183751]:     oslo.messaging=INFO
Jan 27 22:40:31 compute-1 nova_compute[183751]:     oslo.privsep.daemon=INFO
Jan 27 22:40:31 compute-1 nova_compute[183751]:     oslo_messaging=INFO
Jan 27 22:40:31 compute-1 nova_compute[183751]:     oslo_policy=INFO
Jan 27 22:40:31 compute-1 nova_compute[183751]:     qpid=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     requests.packages.urllib3.connectionpool=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     requests.packages.urllib3.util.retry=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     routes.middleware=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     sqlalchemy=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     stevedore=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     suds=INFO
Jan 27 22:40:31 compute-1 nova_compute[183751]:     taskflow=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     urllib3.connectionpool=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     urllib3.util.retry=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:     websocket=WARN
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_schedule_zone = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_thread_pool_size = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disk_allocation_ratio = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enable_new_services = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   executor_thread_pool_size = 64
Jan 27 22:40:31 compute-1 nova_compute[183751]:   fatal_deprecations = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   flat_injected = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   force_config_drive = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   force_raw_images = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   graceful_shutdown_timeout = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   heal_instance_info_cache_interval = -1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   host = compute-1.ctlplane.example.com
Jan 27 22:40:31 compute-1 nova_compute[183751]:   initial_cpu_allocation_ratio = 4.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   initial_disk_allocation_ratio = 0.9
Jan 27 22:40:31 compute-1 nova_compute[183751]:   initial_ram_allocation_ratio = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   injected_network_template = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instance_build_timeout = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instance_delete_interval = 300
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instance_format = [instance: %(uuid)s] 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instance_name_template = instance-%08x
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instance_usage_audit = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instance_usage_audit_period = month
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instance_uuid_format = [instance: %(uuid)s] 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instances_path = /var/lib/nova/instances
Jan 27 22:40:31 compute-1 nova_compute[183751]:   internal_service_availability_zone = internal
Jan 27 22:40:31 compute-1 nova_compute[183751]:   key = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_retry_count = 30
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log-config-append = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log-date-format = %Y-%m-%d %H:%M:%S
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log-dir = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log-file = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log_color = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log_options = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log_rotate_interval = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log_rotate_interval_type = days
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log_rotation_type = size
Jan 27 22:40:31 compute-1 nova_compute[183751]:   logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s
Jan 27 22:40:31 compute-1 nova_compute[183751]:   logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d
Jan 27 22:40:31 compute-1 nova_compute[183751]:   logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s
Jan 27 22:40:31 compute-1 nova_compute[183751]:   logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s
Jan 27 22:40:31 compute-1 nova_compute[183751]:   logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s
Jan 27 22:40:31 compute-1 nova_compute[183751]:   long_rpc_timeout = 1800
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_concurrent_builds = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_concurrent_live_migrations = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_concurrent_snapshots = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_local_block_devices = 3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_logfile_count = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_logfile_size_mb = 20
Jan 27 22:40:31 compute-1 nova_compute[183751]:   maximum_instance_delete_attempts = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   migrate_max_retries = -1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   mkisofs_cmd = /usr/bin/mkisofs
Jan 27 22:40:31 compute-1 nova_compute[183751]:   my_block_storage_ip = 192.168.122.101
Jan 27 22:40:31 compute-1 nova_compute[183751]:   my_ip = 192.168.122.101
Jan 27 22:40:31 compute-1 nova_compute[183751]:   my_shared_fs_storage_ip = 192.168.122.101
Jan 27 22:40:31 compute-1 nova_compute[183751]:   network_allocate_retries = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   non_inheritable_image_properties = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     bittorrent
Jan 27 22:40:31 compute-1 nova_compute[183751]:     cache_in_nova
Jan 27 22:40:31 compute-1 nova_compute[183751]:   osapi_compute_unique_server_name_scope = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   password_length = 12
Jan 27 22:40:31 compute-1 nova_compute[183751]:   periodic_enable = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   periodic_fuzzy_delay = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   pointer_model = usbtablet
Jan 27 22:40:31 compute-1 nova_compute[183751]:   preallocate_images = none
Jan 27 22:40:31 compute-1 nova_compute[183751]:   publish_errors = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   pybasedir = /usr/lib/python3.12/site-packages
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ram_allocation_ratio = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rate_limit_burst = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rate_limit_except_level = CRITICAL
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rate_limit_interval = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   reboot_timeout = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   reclaim_instance_interval = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   record = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   reimage_timeout_per_gb = 20
Jan 27 22:40:31 compute-1 nova_compute[183751]:   report_interval = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rescue_timeout = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   reserved_host_cpus = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   reserved_host_disk_mb = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   reserved_host_memory_mb = 512
Jan 27 22:40:31 compute-1 nova_compute[183751]:   reserved_huge_pages = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   resize_confirm_window = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   resize_fs_using_block_device = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   resume_guests_state_on_host_boot = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rootwrap_config = /etc/nova/rootwrap.conf
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rpc_ping_enabled = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rpc_response_timeout = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   run_external_periodic_tasks = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   running_deleted_instance_action = reap
Jan 27 22:40:31 compute-1 nova_compute[183751]:   running_deleted_instance_poll_interval = 1800
Jan 27 22:40:31 compute-1 nova_compute[183751]:   running_deleted_instance_timeout = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   scheduler_instance_sync_interval = 120
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service_down_time = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   servicegroup_driver = db
Jan 27 22:40:31 compute-1 nova_compute[183751]:   shell_completion = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   shelved_offload_time = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   shelved_poll_interval = 3600
Jan 27 22:40:31 compute-1 nova_compute[183751]:   shutdown_timeout = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   source_is_ipv6 = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ssl_only = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   state_path = /var/lib/nova
Jan 27 22:40:31 compute-1 nova_compute[183751]:   sync_power_state_interval = 600
Jan 27 22:40:31 compute-1 nova_compute[183751]:   sync_power_state_pool_size = 1000
Jan 27 22:40:31 compute-1 nova_compute[183751]:   syslog-log-facility = LOG_USER
Jan 27 22:40:31 compute-1 nova_compute[183751]:   tempdir = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   thread_pool_statistic_period = -1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout_nbd = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   transport_url = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   update_resources_interval = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use-journal = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use-json = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use-syslog = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use_cow_images = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use_rootwrap_daemon = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use_stderr = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vcpu_pin_set = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vif_plugging_is_fatal = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vif_plugging_timeout = 300
Jan 27 22:40:31 compute-1 nova_compute[183751]:   virt_mkfs = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   volume_usage_poll_interval = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   watch-log-file = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   web = /usr/share/spice-html5
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: devices: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enabled_mdev_types = 
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ephemeral_storage_encryption: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cipher = aes-xts-plain64
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_format = luks
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enabled = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   key_size = 512
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: filter_scheduler: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   aggregate_image_properties_isolation_namespace = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   aggregate_image_properties_isolation_separator = .
Jan 27 22:40:31 compute-1 nova_compute[183751]:   available_filters = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     nova.scheduler.filters.all_filters
Jan 27 22:40:31 compute-1 nova_compute[183751]:   build_failure_weight_multiplier = 1000000.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_weight_multiplier = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cross_cell_move_weight_multiplier = 1000000.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disk_weight_multiplier = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enabled_filters = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     ComputeCapabilitiesFilter
Jan 27 22:40:31 compute-1 nova_compute[183751]:     ComputeFilter
Jan 27 22:40:31 compute-1 nova_compute[183751]:     ImagePropertiesFilter
Jan 27 22:40:31 compute-1 nova_compute[183751]:     ServerGroupAffinityFilter
Jan 27 22:40:31 compute-1 nova_compute[183751]:     ServerGroupAntiAffinityFilter
Jan 27 22:40:31 compute-1 nova_compute[183751]:   host_subset_size = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   hypervisor_version_weight_multiplier = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   image_properties_default_architecture = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   image_props_weight_multiplier = 0.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   image_props_weight_setting = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   io_ops_weight_multiplier = -1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   isolated_hosts = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   isolated_images = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_instances_per_host = 50
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_io_ops_per_host = 8
Jan 27 22:40:31 compute-1 nova_compute[183751]:   num_instances_weight_multiplier = 0.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   pci_in_placement = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   pci_weight_multiplier = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ram_weight_multiplier = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   restrict_isolated_hosts_to_isolated_images = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   shuffle_best_same_weighed_hosts = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   soft_affinity_weight_multiplier = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   soft_anti_affinity_weight_multiplier = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   track_instance_changes = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   weight_classes = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     nova.scheduler.weights.all_weighers
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: glance: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   api_servers = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   debug = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_trusted_certificate_ids = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enable_certificate_validation = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enable_rbd_download = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint-override = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   min_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   num_retries = 3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rbd_ceph_conf = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rbd_connect_timeout = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rbd_pool = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rbd_user = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   region-name = regionOne
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retriable-status-codes = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-type = image
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   valid-interfaces = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     internal
Jan 27 22:40:31 compute-1 nova_compute[183751]:   verify_glance_signatures = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: guestfs: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   debug = False
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: image_cache: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   manager_interval = 2400
Jan 27 22:40:31 compute-1 nova_compute[183751]:   precache_concurrency = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   remove_unused_base_images = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   remove_unused_original_minimum_age_seconds = 86400
Jan 27 22:40:31 compute-1 nova_compute[183751]:   remove_unused_resized_minimum_age_seconds = 3600
Jan 27 22:40:31 compute-1 nova_compute[183751]:   subdirectory_name = _base
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: ironic: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   api_max_retries = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   api_retry_interval = 2
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_section = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_type = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   conductor_group = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint-override = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   min_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   peer_list = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   region-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retriable-status-codes = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   serial_console_state_timeout = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-type = baremetal
Jan 27 22:40:31 compute-1 nova_compute[183751]:   shard = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   valid-interfaces = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     internal
Jan 27 22:40:31 compute-1 nova_compute[183751]:     public
Jan 27 22:40:31 compute-1 nova_compute[183751]:   version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: key_manager: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   backend = barbican
Jan 27 22:40:31 compute-1 nova_compute[183751]:   fixed_key = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: keystone: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint-override = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   min_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   region-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retriable-status-codes = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-type = identity
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   valid-interfaces = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     internal
Jan 27 22:40:31 compute-1 nova_compute[183751]:     public
Jan 27 22:40:31 compute-1 nova_compute[183751]:   version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: libvirt: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ceph_mount_options = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ceph_mount_point_base = /var/lib/nova/mnt
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection_uri = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_mode = custom
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_model_extra_flags = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_models = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     Nehalem
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_power_governor_high = performance
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_power_governor_low = powersave
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_power_management = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cpu_power_management_strategy = cpu_state
Jan 27 22:40:31 compute-1 nova_compute[183751]:   device_detach_attempts = 8
Jan 27 22:40:31 compute-1 nova_compute[183751]:   device_detach_timeout = 20
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disk_cachemodes = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disk_prefix = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enabled_perf_events = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   file_backed_memory = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   gid_maps = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   hw_disk_discard = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   hw_machine_type = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     x86_64=q35
Jan 27 22:40:31 compute-1 nova_compute[183751]:   images_rbd_ceph_conf = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   images_rbd_glance_copy_poll_interval = 15
Jan 27 22:40:31 compute-1 nova_compute[183751]:   images_rbd_glance_copy_timeout = 600
Jan 27 22:40:31 compute-1 nova_compute[183751]:   images_rbd_glance_store_name = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   images_rbd_pool = rbd
Jan 27 22:40:31 compute-1 nova_compute[183751]:   images_type = qcow2
Jan 27 22:40:31 compute-1 nova_compute[183751]:   images_volume_group = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   inject_key = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   inject_partition = -2
Jan 27 22:40:31 compute-1 nova_compute[183751]:   inject_password = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   iscsi_iface = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   iser_use_multipath = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_bandwidth = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_completion_timeout = 800
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_downtime = 500
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_downtime_delay = 75
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_downtime_steps = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_inbound_addr = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_permit_auto_converge = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_permit_post_copy = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_scheme = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_timeout_action = force_complete
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_tunnelled = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_uri = qemu+tls://%s/system
Jan 27 22:40:31 compute-1 nova_compute[183751]:   live_migration_with_native_tls = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_queues = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   mem_stats_period_seconds = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   migration_inbound_addr = 192.168.122.101
Jan 27 22:40:31 compute-1 nova_compute[183751]:   nfs_mount_options = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   nfs_mount_point_base = /var/lib/nova/mnt
Jan 27 22:40:31 compute-1 nova_compute[183751]:   num_aoe_discover_tries = 3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   num_iser_scan_tries = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   num_memory_encrypted_guests = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   num_nvme_discover_tries = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   num_pcie_ports = 24
Jan 27 22:40:31 compute-1 nova_compute[183751]:   num_volume_scan_tries = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   pmem_namespaces = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   quobyte_client_cfg = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   quobyte_mount_point_base = /var/lib/nova/mnt
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rbd_connect_timeout = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rbd_destroy_volume_retries = 12
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rbd_destroy_volume_retry_interval = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rbd_secret_uuid = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rbd_user = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   realtime_scheduler_priority = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   remote_filesystem_transport = ssh
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rescue_image_id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rescue_kernel_id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rescue_ramdisk_id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rng_dev_path = /dev/urandom
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rx_queue_size = 512
Jan 27 22:40:31 compute-1 nova_compute[183751]:   smbfs_mount_options = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   smbfs_mount_point_base = /var/lib/nova/mnt
Jan 27 22:40:31 compute-1 nova_compute[183751]:   snapshot_compression = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   snapshot_image_format = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   snapshots_directory = /var/lib/nova/instances/snapshots
Jan 27 22:40:31 compute-1 nova_compute[183751]:   sparse_logical_volumes = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   swtpm_enabled = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   swtpm_group = tss
Jan 27 22:40:31 compute-1 nova_compute[183751]:   swtpm_user = tss
Jan 27 22:40:31 compute-1 nova_compute[183751]:   sysinfo_serial = unique
Jan 27 22:40:31 compute-1 nova_compute[183751]:   tb_cache_size = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   tx_queue_size = 512
Jan 27 22:40:31 compute-1 nova_compute[183751]:   uid_maps = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use_virtio_for_bridges = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   virt_type = kvm
Jan 27 22:40:31 compute-1 nova_compute[183751]:   volume_clear = zero
Jan 27 22:40:31 compute-1 nova_compute[183751]:   volume_clear_size = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   volume_enforce_multipath = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   volume_use_multipath = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vzstorage_cache_path = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vzstorage_mount_group = qemu
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vzstorage_mount_opts = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vzstorage_mount_perms = 0770
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vzstorage_mount_point_base = /var/lib/nova/mnt
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vzstorage_mount_user = stack
Jan 27 22:40:31 compute-1 nova_compute[183751]:   wait_soft_reboot_seconds = 120
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: manila: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_section = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_type = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint-override = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   min_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   region-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retriable-status-codes = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-type = shared-file-system
Jan 27 22:40:31 compute-1 nova_compute[183751]:   share_apply_policy_timeout = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   valid-interfaces = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     internal
Jan 27 22:40:31 compute-1 nova_compute[183751]:     public
Jan 27 22:40:31 compute-1 nova_compute[183751]:   version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: metrics: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   required = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   weight_multiplier = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   weight_of_unavailable = -10000.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   weight_setting = 
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: mks: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enabled = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   mksproxy_base_url = http://127.0.0.1:6090/
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: neutron: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth-url = https://keystone-internal.openstack.svc:5000
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_section = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_type = password
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default-domain-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_floating_pool = nova
Jan 27 22:40:31 compute-1 nova_compute[183751]:   domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   domain-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint-override = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   extension_sync_interval = 600
Jan 27 22:40:31 compute-1 nova_compute[183751]:   http_retries = 3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   metadata_proxy_shared_secret = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   min_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ovs_bridge = br-int
Jan 27 22:40:31 compute-1 nova_compute[183751]:   password = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   physnets = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-domain-name = Default
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-name = service
Jan 27 22:40:31 compute-1 nova_compute[183751]:   region-name = regionOne
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retriable-status-codes = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-type = network
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service_metadata_proxy = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   system-scope = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   trust-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-domain-name = Default
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   username = nova
Jan 27 22:40:31 compute-1 nova_compute[183751]:   valid-interfaces = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     internal
Jan 27 22:40:31 compute-1 nova_compute[183751]:   version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: neutron_tunnel: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   numa_nodes = 
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: notifications: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   bdms_in_notifications = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_level = INFO
Jan 27 22:40:31 compute-1 nova_compute[183751]:   include_share_mapping = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   notification_format = both
Jan 27 22:40:31 compute-1 nova_compute[183751]:   notify_on_state_change = vm_and_task_state
Jan 27 22:40:31 compute-1 nova_compute[183751]:   versioned_notifications_topics = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     versioned_notifications
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: nova_sys_admin: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   capabilities = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     0
Jan 27 22:40:31 compute-1 nova_compute[183751]:     1
Jan 27 22:40:31 compute-1 nova_compute[183751]:     12
Jan 27 22:40:31 compute-1 nova_compute[183751]:     2
Jan 27 22:40:31 compute-1 nova_compute[183751]:     21
Jan 27 22:40:31 compute-1 nova_compute[183751]:     3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   group = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   helper_command = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log_daemon_traceback = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   logger_name = oslo_privsep.daemon
Jan 27 22:40:31 compute-1 nova_compute[183751]:   thread_pool_size = 8
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: os_brick: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   lock_path = /var/lib/nova/tmp
Jan 27 22:40:31 compute-1 nova_compute[183751]:   wait_mpath_device_attempts = 4
Jan 27 22:40:31 compute-1 nova_compute[183751]:   wait_mpath_device_interval = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: os_vif_linux_bridge: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   flat_interface = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   forward_bridge_interface = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     all
Jan 27 22:40:31 compute-1 nova_compute[183751]:   iptables_bottom_regex = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   iptables_drop_action = DROP
Jan 27 22:40:31 compute-1 nova_compute[183751]:   iptables_top_regex = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   network_device_mtu = 1500
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use_ipv6 = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vlan_interface = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: os_vif_ovs: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default_qos_type = linux-noop
Jan 27 22:40:31 compute-1 nova_compute[183751]:   isolate_vif = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   network_device_mtu = 1500
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ovs_vsctl_timeout = 120
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ovsdb_connection = tcp:127.0.0.1:6640
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ovsdb_interface = native
Jan 27 22:40:31 compute-1 nova_compute[183751]:   per_port_bridge = False
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: oslo_concurrency: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disable_process_locking = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   lock_path = /var/lib/nova/tmp
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: oslo_limit: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth-url = https://keystone-internal.openstack.svc:5000
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_section = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_type = password
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default-domain-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   domain-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint-override = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint_id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint_interface = internal
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint_region_name = regionOne
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint_service_name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint_service_type = compute
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max-version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   min-version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   password = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-domain-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   region-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retriable-status-codes = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-type = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   system-scope = all
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   trust-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-domain-name = Default
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   username = nova
Jan 27 22:40:31 compute-1 nova_compute[183751]:   valid-interfaces = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: oslo_messaging_metrics: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   metrics_buffer_size = 1000
Jan 27 22:40:31 compute-1 nova_compute[183751]:   metrics_enabled = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   metrics_process_name = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   metrics_socket_file = /var/tmp/metrics_collector.sock
Jan 27 22:40:31 compute-1 nova_compute[183751]:   metrics_thread_stop_timeout = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: oslo_messaging_notifications: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   driver = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     messagingv2
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retry = -1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   topics = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     notifications
Jan 27 22:40:31 compute-1 nova_compute[183751]:   transport_url = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: oslo_messaging_rabbit: 
Jan 27 22:40:31 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 27 22:40:31 compute-1 nova_compute[183751]:   amqp_auto_delete = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   amqp_durable_queues = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   conn_pool_min_size = 2
Jan 27 22:40:31 compute-1 nova_compute[183751]:   conn_pool_ttl = 1200
Jan 27 22:40:31 compute-1 nova_compute[183751]:   direct_mandatory_flag = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enable_cancel_on_failover = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   heartbeat_in_pthread = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   heartbeat_rate = 3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   heartbeat_timeout_threshold = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   hostname = compute-1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   kombu_compression = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   kombu_failover_strategy = round-robin
Jan 27 22:40:31 compute-1 nova_compute[183751]:   kombu_missing_consumer_retry_timeout = 60
Jan 27 22:40:31 compute-1 nova_compute[183751]:   kombu_reconnect_delay = 1.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   kombu_reconnect_splay = 0.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   processname = nova-compute
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_ha_queues = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_interval_max = 30
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_login_method = AMQPLAIN
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_qos_prefetch_count = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_quorum_delivery_limit = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_quorum_max_memory_bytes = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_quorum_max_memory_length = 0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_quorum_queue = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_retry_backoff = 2
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_retry_interval = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_stream_fanout = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_transient_queues_ttl = 1800
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rabbit_transient_quorum_queue = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   rpc_conn_pool_size = 30
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ssl = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ssl_ca_file = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ssl_cert_file = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ssl_enforce_fips_mode = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ssl_key_file = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ssl_version = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use_queue_manager = False
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: oslo_middleware: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   http_basic_auth_user_file = /etc/htpasswd
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: oslo_policy: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enforce_new_defaults = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enforce_scope = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   policy_default_rule = default
Jan 27 22:40:31 compute-1 nova_compute[183751]:   policy_dirs = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     policy.d
Jan 27 22:40:31 compute-1 nova_compute[183751]:   policy_file = policy.yaml
Jan 27 22:40:31 compute-1 nova_compute[183751]:   remote_content_type = application/x-www-form-urlencoded
Jan 27 22:40:31 compute-1 nova_compute[183751]:   remote_ssl_ca_crt_file = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   remote_ssl_client_crt_file = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   remote_ssl_client_key_file = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   remote_ssl_verify_server_crt = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   remote_timeout = 60.0
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: oslo_reports: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   file_event_handler = /var/lib/nova
Jan 27 22:40:31 compute-1 nova_compute[183751]:   file_event_handler_interval = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log_dir = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: oslo_versionedobjects: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   fatal_exception_format_errors = False
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: pci: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   alias = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   device_spec = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   report_in_placement = False
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: placement: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth-url = https://keystone-internal.openstack.svc:5000
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_section = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_type = password
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connect-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default-domain-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   domain-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   endpoint-override = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   min_version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   password = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-domain-name = Default
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-name = service
Jan 27 22:40:31 compute-1 nova_compute[183751]:   region-name = regionOne
Jan 27 22:40:31 compute-1 nova_compute[183751]:   retriable-status-codes = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   service-type = placement
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retries = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   status-code-retry-delay = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   system-scope = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   trust-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-domain-name = Default
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   username = nova
Jan 27 22:40:31 compute-1 nova_compute[183751]:   valid-interfaces = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     internal
Jan 27 22:40:31 compute-1 nova_compute[183751]:   version = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: privsep_osbrick: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   capabilities = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     2
Jan 27 22:40:31 compute-1 nova_compute[183751]:     21
Jan 27 22:40:31 compute-1 nova_compute[183751]:   group = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   helper_command = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log_daemon_traceback = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   logger_name = os_brick.privileged
Jan 27 22:40:31 compute-1 nova_compute[183751]:   thread_pool_size = 8
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: quota: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cores = 20
Jan 27 22:40:31 compute-1 nova_compute[183751]:   count_usage_from_placement = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   driver = nova.quota.DbQuotaDriver
Jan 27 22:40:31 compute-1 nova_compute[183751]:   injected_file_content_bytes = 10240
Jan 27 22:40:31 compute-1 nova_compute[183751]:   injected_file_path_length = 255
Jan 27 22:40:31 compute-1 nova_compute[183751]:   injected_files = 5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   instances = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   key_pairs = 100
Jan 27 22:40:31 compute-1 nova_compute[183751]:   metadata_items = 128
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ram = 51200
Jan 27 22:40:31 compute-1 nova_compute[183751]:   recheck_quota = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   server_group_members = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   server_groups = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   unified_limits_resource_list = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     servers
Jan 27 22:40:31 compute-1 nova_compute[183751]:   unified_limits_resource_strategy = require
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nfsrahead[221196]: setting /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540 readahead to 128
Jan 27 22:40:31 compute-1 nova_compute[183751]: scheduler: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   discover_hosts_in_cells_interval = -1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enable_isolated_aggregate_filtering = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   image_metadata_prefilter = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   limit_tenants_to_placement_aggregate = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_attempts = 3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   max_placement_results = 1000
Jan 27 22:40:31 compute-1 nova_compute[183751]:   placement_aggregate_required_for_tenants = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   query_placement_for_image_type_support = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   query_placement_for_routed_network_aggregates = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   workers = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: serial_console: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   base_url = ws://127.0.0.1:6083/
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enabled = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   port_range = 10000:20000
Jan 27 22:40:31 compute-1 nova_compute[183751]:   proxyclient_address = 127.0.0.1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   serialproxy_host = 0.0.0.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   serialproxy_port = 6083
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: service_user: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth-url = https://keystone-internal.openstack.svc:5000
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_section = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_type = password
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   default-domain-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   domain-name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   password = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-domain-name = Default
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   project-name = service
Jan 27 22:40:31 compute-1 nova_compute[183751]:   send_service_user_token = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   system-scope = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   trust-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-domain-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-domain-name = Default
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user-id = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   username = nova
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: spice: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   agent_enabled = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enabled = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html
Jan 27 22:40:31 compute-1 nova_compute[183751]:   html5proxy_host = 0.0.0.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   html5proxy_port = 6082
Jan 27 22:40:31 compute-1 nova_compute[183751]:   image_compression = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   jpeg_compression = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   playback_compression = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   require_secure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   server_listen = 127.0.0.1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   server_proxyclient_address = 127.0.0.1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   spice_direct_proxy_base_url = http://127.0.0.1:13002/nova
Jan 27 22:40:31 compute-1 nova_compute[183751]:   streaming_mode = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   zlib_compression = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: upgrade_levels: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   baseapi = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   compute = auto
Jan 27 22:40:31 compute-1 nova_compute[183751]:   conductor = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   scheduler = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: vault: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   approle_role_id = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   approle_secret_id = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   kv_mountpoint = secret
Jan 27 22:40:31 compute-1 nova_compute[183751]:   kv_path = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   kv_version = 2
Jan 27 22:40:31 compute-1 nova_compute[183751]:   namespace = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   root_token_id = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ssl_ca_crt_file = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = 60.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use_ssl = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vault_url = http://127.0.0.1:8200
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: vendordata_dynamic_auth: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_section = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_type = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cafile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   certfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   collect-timing = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   keyfile = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   split-loggers = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   timeout = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: vif_plug_linux_bridge_privileged: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   capabilities = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     12
Jan 27 22:40:31 compute-1 nova_compute[183751]:   group = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   helper_command = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log_daemon_traceback = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   logger_name = oslo_privsep.daemon
Jan 27 22:40:31 compute-1 nova_compute[183751]:   thread_pool_size = 8
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: vif_plug_ovs_privileged: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   capabilities = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     1
Jan 27 22:40:31 compute-1 nova_compute[183751]:     12
Jan 27 22:40:31 compute-1 nova_compute[183751]:   group = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   helper_command = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   log_daemon_traceback = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   logger_name = oslo_privsep.daemon
Jan 27 22:40:31 compute-1 nova_compute[183751]:   thread_pool_size = 8
Jan 27 22:40:31 compute-1 nova_compute[183751]:   user = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: vmware: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   api_retry_count = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ca_file = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cache_prefix = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cluster_name = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   connection_pool_size = 10
Jan 27 22:40:31 compute-1 nova_compute[183751]:   console_delay_seconds = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   datastore_regex = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   host_ip = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   host_password = ***
Jan 27 22:40:31 compute-1 nova_compute[183751]:   host_port = 443
Jan 27 22:40:31 compute-1 nova_compute[183751]:   host_username = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   insecure = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   integration_bridge = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   maximum_objects = 100
Jan 27 22:40:31 compute-1 nova_compute[183751]:   pbm_default_policy = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   pbm_enabled = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   pbm_wsdl_location = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   serial_log_dir = /opt/vmware/vspc
Jan 27 22:40:31 compute-1 nova_compute[183751]:   serial_port_proxy_uri = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   serial_port_service_uri = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   task_poll_interval = 0.5
Jan 27 22:40:31 compute-1 nova_compute[183751]:   use_linked_clone = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vnc_keymap = en-us
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vnc_port = 5900
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vnc_port_total = 10000
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: vnc: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   auth_schemes = 
Jan 27 22:40:31 compute-1 nova_compute[183751]:     none
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enabled = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   novncproxy_base_url = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html
Jan 27 22:40:31 compute-1 nova_compute[183751]:   novncproxy_host = 0.0.0.0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   novncproxy_port = 6080
Jan 27 22:40:31 compute-1 nova_compute[183751]:   server_listen = ::0
Jan 27 22:40:31 compute-1 nova_compute[183751]:   server_proxyclient_address = 192.168.122.101
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vencrypt_ca_certs = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vencrypt_client_cert = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   vencrypt_client_key = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: workarounds: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disable_compute_service_check_for_ffu = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disable_deep_image_inspection = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disable_fallback_pcpu_query = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disable_group_policy_check_upcall = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disable_libvirt_livesnapshot = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   disable_rootwrap = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enable_numa_live_migration = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   enable_qemu_monitor_announce_self = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ensure_libvirt_rbd_instance_dir_cleanup = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   handle_virt_lifecycle_events = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   libvirt_disable_apic = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   never_download_image_if_on_rbd = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   qemu_monitor_announce_self_count = 3
Jan 27 22:40:31 compute-1 nova_compute[183751]:   qemu_monitor_announce_self_interval = 1
Jan 27 22:40:31 compute-1 nova_compute[183751]:   reserve_disk_resource_for_image_cache = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   skip_cpu_compare_at_startup = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   skip_cpu_compare_on_dest = True
Jan 27 22:40:31 compute-1 nova_compute[183751]:   skip_hypervisor_version_check_on_lm = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   skip_reserve_in_use_ironic_nodes = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   unified_limits_count_pcpu_as_vcpu = False
Jan 27 22:40:31 compute-1 nova_compute[183751]:   wait_for_vif_plugged_event_during_hard_reboot = 
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: wsgi: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   api_paste_config = api-paste.ini
Jan 27 22:40:31 compute-1 nova_compute[183751]:   secure_proxy_ssl_header = None
Jan 27 22:40:31 compute-1 nova_compute[183751]: 
Jan 27 22:40:31 compute-1 nova_compute[183751]: zvm: 
Jan 27 22:40:31 compute-1 nova_compute[183751]:   ca_file = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   cloud_connector_url = None
Jan 27 22:40:31 compute-1 nova_compute[183751]:   image_tmp_path = /var/lib/nova/images
Jan 27 22:40:31 compute-1 nova_compute[183751]:   reachable_timeout = 300
Jan 27 22:40:31 compute-1 nova_compute[183751]: 2026-01-27 22:40:31.572 183755 DEBUG nova.virt.libvirt.volume.mount [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] _HostMountState.mount() for /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540 generation 0 completed successfully mount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:334
Jan 27 22:40:31 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 27 22:40:31 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 27 22:40:31 compute-1 nova_compute[183751]: 2026-01-27 22:40:31.645 183755 DEBUG nova.virt.libvirt.guest [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] attach device xml: <disk type="file" device="disk">
Jan 27 22:40:31 compute-1 nova_compute[183751]:   <driver name="qemu" type="raw" cache="none" io="native"/>
Jan 27 22:40:31 compute-1 nova_compute[183751]:   <alias name="ua-d3859eb7-9382-438b-ae1c-020c78adf571"/>
Jan 27 22:40:31 compute-1 nova_compute[183751]:   <source file="/var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540/volume-d3859eb7-9382-438b-ae1c-020c78adf571"/>
Jan 27 22:40:31 compute-1 nova_compute[183751]:   <target dev="vdb" bus="virtio"/>
Jan 27 22:40:31 compute-1 nova_compute[183751]:   <serial>d3859eb7-9382-438b-ae1c-020c78adf571</serial>
Jan 27 22:40:31 compute-1 nova_compute[183751]: </disk>
Jan 27 22:40:31 compute-1 nova_compute[183751]:  attach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:336
Jan 27 22:40:31 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 27 22:40:31 compute-1 nova_compute[183751]: 2026-01-27 22:40:31.875 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:32 compute-1 podman[221218]: 2026-01-27 22:40:32.825227556 +0000 UTC m=+0.126025754 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260126, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 27 22:40:33 compute-1 nova_compute[183751]: 2026-01-27 22:40:33.364 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:33 compute-1 nova_compute[183751]: 2026-01-27 22:40:33.416 183755 DEBUG nova.virt.libvirt.driver [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:40:33 compute-1 nova_compute[183751]: 2026-01-27 22:40:33.416 183755 DEBUG nova.virt.libvirt.driver [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:40:33 compute-1 nova_compute[183751]: 2026-01-27 22:40:33.416 183755 DEBUG nova.virt.libvirt.driver [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Jan 27 22:40:33 compute-1 nova_compute[183751]: 2026-01-27 22:40:33.417 183755 DEBUG nova.virt.libvirt.driver [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] No VIF found with MAC fa:16:3e:04:e4:55, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Jan 27 22:40:35 compute-1 nova_compute[183751]: 2026-01-27 22:40:35.167 183755 DEBUG oslo_concurrency.lockutils [None req-f08e538b-34cf-4644-8fa3-1ed3a97b3966 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 7.602s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:40:35 compute-1 podman[193064]: time="2026-01-27T22:40:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:40:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:40:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16575 "" "Go-http-client/1.1"
Jan 27 22:40:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:40:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2645 "" "Go-http-client/1.1"
Jan 27 22:40:36 compute-1 podman[221245]: 2026-01-27 22:40:36.815952211 +0000 UTC m=+0.110024889 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:40:36 compute-1 podman[221244]: 2026-01-27 22:40:36.820953554 +0000 UTC m=+0.121509232 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Jan 27 22:40:36 compute-1 nova_compute[183751]: 2026-01-27 22:40:36.881 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:37 compute-1 nova_compute[183751]: 2026-01-27 22:40:37.455 183755 DEBUG oslo_concurrency.lockutils [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:40:37 compute-1 nova_compute[183751]: 2026-01-27 22:40:37.456 183755 DEBUG oslo_concurrency.lockutils [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:40:37 compute-1 nova_compute[183751]: 2026-01-27 22:40:37.964 183755 DEBUG nova.objects.instance [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lazy-loading 'flavor' on Instance uuid 199b6173-7807-45a0-9d03-8d9e1945a3ca obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:40:38 compute-1 nova_compute[183751]: 2026-01-27 22:40:38.144 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:40:38 compute-1 nova_compute[183751]: 2026-01-27 22:40:38.368 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.153 183755 INFO nova.compute.manager [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Detaching volume d3859eb7-9382-438b-ae1c-020c78adf571
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.343 183755 INFO nova.virt.block_device [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Attempting to driver detach volume d3859eb7-9382-438b-ae1c-020c78adf571 from mountpoint /dev/vdb
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.356 183755 DEBUG nova.virt.libvirt.driver [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Found disk vdb by alias ua-d3859eb7-9382-438b-ae1c-020c78adf571 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.360 183755 DEBUG nova.virt.libvirt.driver [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Found disk vdb by alias ua-d3859eb7-9382-438b-ae1c-020c78adf571 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.360 183755 DEBUG nova.virt.libvirt.driver [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Attempting to detach device vdb from instance 199b6173-7807-45a0-9d03-8d9e1945a3ca from the persistent domain config. _detach_from_persistent /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2576
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.361 183755 DEBUG nova.virt.libvirt.guest [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] detach device xml: <disk type="file" device="disk">
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <driver name="qemu" type="raw" cache="none" io="native"/>
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <alias name="ua-d3859eb7-9382-438b-ae1c-020c78adf571"/>
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <source file="/var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540/volume-d3859eb7-9382-438b-ae1c-020c78adf571"/>
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <target dev="vdb" bus="virtio"/>
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <serial>d3859eb7-9382-438b-ae1c-020c78adf571</serial>
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 27 22:40:39 compute-1 nova_compute[183751]: </disk>
Jan 27 22:40:39 compute-1 nova_compute[183751]:  detach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:466
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.372 183755 DEBUG nova.virt.libvirt.driver [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Found disk vdb by alias ua-d3859eb7-9382-438b-ae1c-020c78adf571 _get_guest_disk_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2825
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.373 183755 WARNING nova.virt.libvirt.driver [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Failed to detach device vdb from instance 199b6173-7807-45a0-9d03-8d9e1945a3ca from the persistent domain config. Libvirt did not report any error but the device is still in the config.
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.373 183755 DEBUG nova.virt.libvirt.driver [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] (1/8): Attempting to detach device vdb with device alias ua-d3859eb7-9382-438b-ae1c-020c78adf571 from instance 199b6173-7807-45a0-9d03-8d9e1945a3ca from the live domain config. _detach_from_live_with_retry /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2612
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.374 183755 DEBUG nova.virt.libvirt.guest [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] detach device xml: <disk type="file" device="disk">
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <driver name="qemu" type="raw" cache="none" io="native"/>
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <alias name="ua-d3859eb7-9382-438b-ae1c-020c78adf571"/>
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <source file="/var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540/volume-d3859eb7-9382-438b-ae1c-020c78adf571"/>
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <target dev="vdb" bus="virtio"/>
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <serial>d3859eb7-9382-438b-ae1c-020c78adf571</serial>
Jan 27 22:40:39 compute-1 nova_compute[183751]:   <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 27 22:40:39 compute-1 nova_compute[183751]: </disk>
Jan 27 22:40:39 compute-1 nova_compute[183751]:  detach_device /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:466
Jan 27 22:40:39 compute-1 nova_compute[183751]: 2026-01-27 22:40:39.525 183755 DEBUG nova.virt.libvirt.driver [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias ua-d3859eb7-9382-438b-ae1c-020c78adf571 for instance 199b6173-7807-45a0-9d03-8d9e1945a3ca _detach_from_live_and_wait_for_event /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:2688
Jan 27 22:40:41 compute-1 nova_compute[183751]: 2026-01-27 22:40:41.885 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:43 compute-1 nova_compute[183751]: 2026-01-27 22:40:43.370 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:44 compute-1 ovn_controller[95969]: 2026-01-27T22:40:44Z|00130|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Jan 27 22:40:45 compute-1 podman[221286]: 2026-01-27 22:40:45.773543738 +0000 UTC m=+0.070798969 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:40:46 compute-1 nova_compute[183751]: 2026-01-27 22:40:46.888 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:48 compute-1 nova_compute[183751]: 2026-01-27 22:40:48.374 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:49 compute-1 openstack_network_exporter[195945]: ERROR   22:40:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:40:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:40:49 compute-1 openstack_network_exporter[195945]: ERROR   22:40:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:40:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:40:51 compute-1 nova_compute[183751]: 2026-01-27 22:40:51.892 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:53 compute-1 nova_compute[183751]: 2026-01-27 22:40:53.376 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:56 compute-1 nova_compute[183751]: 2026-01-27 22:40:56.894 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:58 compute-1 nova_compute[183751]: 2026-01-27 22:40:58.378 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:40:59 compute-1 nova_compute[183751]: 2026-01-27 22:40:59.527 183755 WARNING nova.virt.libvirt.driver [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Waiting for libvirt event about the detach of device vdb with device alias ua-d3859eb7-9382-438b-ae1c-020c78adf571 from instance 199b6173-7807-45a0-9d03-8d9e1945a3ca is timed out.
Jan 27 22:40:59 compute-1 nova_compute[183751]: 2026-01-27 22:40:59.538 183755 INFO nova.virt.libvirt.driver [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Successfully detached device vdb from instance 199b6173-7807-45a0-9d03-8d9e1945a3ca from the live domain config.
Jan 27 22:40:59 compute-1 nova_compute[183751]: 2026-01-27 22:40:59.542 183755 DEBUG nova.virt.libvirt.volume.mount [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Got _HostMountState generation 0 get_state /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:91
Jan 27 22:40:59 compute-1 nova_compute[183751]: 2026-01-27 22:40:59.542 183755 DEBUG nova.virt.libvirt.volume.mount [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] _HostMountState.umount(vol_name=volume-d3859eb7-9382-438b-ae1c-020c78adf571, mountpoint=/var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540) generation 0 umount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:349
Jan 27 22:40:59 compute-1 nova_compute[183751]: 2026-01-27 22:40:59.544 183755 DEBUG nova.virt.libvirt.volume.mount [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Unmounting /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540 generation 0 _real_umount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:382
Jan 27 22:40:59 compute-1 systemd[1]: var-lib-nova-mnt-cec891824fc057f7ee63f2ed70041540.mount: Deactivated successfully.
Jan 27 22:40:59 compute-1 nova_compute[183751]: 2026-01-27 22:40:59.633 183755 DEBUG nova.virt.libvirt.volume.mount [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] _HostMountState.umount() for /var/lib/nova/mnt/cec891824fc057f7ee63f2ed70041540 generation 0 completed successfully umount /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:372
Jan 27 22:41:00 compute-1 nova_compute[183751]: 2026-01-27 22:41:00.908 183755 DEBUG oslo_concurrency.lockutils [None req-4c0a7feb-ae8c-46f3-b036-ef3568987290 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 23.452s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:41:01 compute-1 nova_compute[183751]: 2026-01-27 22:41:01.896 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:03 compute-1 nova_compute[183751]: 2026-01-27 22:41:03.381 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:03 compute-1 podman[221316]: 2026-01-27 22:41:03.842081992 +0000 UTC m=+0.144349626 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Jan 27 22:41:03 compute-1 nova_compute[183751]: 2026-01-27 22:41:03.896 183755 DEBUG oslo_concurrency.lockutils [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:41:03 compute-1 nova_compute[183751]: 2026-01-27 22:41:03.897 183755 DEBUG oslo_concurrency.lockutils [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:41:03 compute-1 nova_compute[183751]: 2026-01-27 22:41:03.897 183755 DEBUG oslo_concurrency.lockutils [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:41:03 compute-1 nova_compute[183751]: 2026-01-27 22:41:03.897 183755 DEBUG oslo_concurrency.lockutils [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:41:03 compute-1 nova_compute[183751]: 2026-01-27 22:41:03.898 183755 DEBUG oslo_concurrency.lockutils [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:41:03 compute-1 nova_compute[183751]: 2026-01-27 22:41:03.915 183755 INFO nova.compute.manager [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Terminating instance
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.436 183755 DEBUG nova.compute.manager [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Jan 27 22:41:04 compute-1 kernel: tap429754cf-05 (unregistering): left promiscuous mode
Jan 27 22:41:04 compute-1 NetworkManager[56069]: <info>  [1769553664.4674] device (tap429754cf-05): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 27 22:41:04 compute-1 ovn_controller[95969]: 2026-01-27T22:41:04Z|00131|binding|INFO|Releasing lport 429754cf-05d3-4257-8e96-2548d7d594e7 from this chassis (sb_readonly=0)
Jan 27 22:41:04 compute-1 ovn_controller[95969]: 2026-01-27T22:41:04Z|00132|binding|INFO|Setting lport 429754cf-05d3-4257-8e96-2548d7d594e7 down in Southbound
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.478 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:04 compute-1 ovn_controller[95969]: 2026-01-27T22:41:04Z|00133|binding|INFO|Removing iface tap429754cf-05 ovn-installed in OVS
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.482 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.499 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:e4:55 10.100.0.13'], port_security=['fa:16:3e:04:e4:55 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '199b6173-7807-45a0-9d03-8d9e1945a3ca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3bb8ae35-374a-402a-86a2-14918e05f958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6857f0c4294a43aab72cea9f0842f4c8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '92693aa1-5f0b-4890-88fb-d4b31acc895c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ba2e925d-a604-49fc-8d28-9e325ef478cb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>], logical_port=429754cf-05d3-4257-8e96-2548d7d594e7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f2f3c27e930>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.502 105247 INFO neutron.agent.ovn.metadata.agent [-] Port 429754cf-05d3-4257-8e96-2548d7d594e7 in datapath 3bb8ae35-374a-402a-86a2-14918e05f958 unbound from our chassis
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.503 105247 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3bb8ae35-374a-402a-86a2-14918e05f958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.509 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[37febafd-72fe-4cc6-9c43-00c962a9fe71]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.510 105247 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958 namespace which is not needed anymore
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.511 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:04 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Jan 27 22:41:04 compute-1 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000001b.scope: Consumed 14.621s CPU time.
Jan 27 22:41:04 compute-1 systemd-machined[155034]: Machine qemu-10-instance-0000001b terminated.
Jan 27 22:41:04 compute-1 neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958[221118]: [NOTICE]   (221143) : haproxy version is 3.0.5-8e879a5
Jan 27 22:41:04 compute-1 neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958[221118]: [NOTICE]   (221143) : path to executable is /usr/sbin/haproxy
Jan 27 22:41:04 compute-1 neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958[221118]: [WARNING]  (221143) : Exiting Master process...
Jan 27 22:41:04 compute-1 podman[221367]: 2026-01-27 22:41:04.690209722 +0000 UTC m=+0.064599907 container kill 3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 27 22:41:04 compute-1 neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958[221118]: [ALERT]    (221143) : Current worker (221146) exited with code 143 (Terminated)
Jan 27 22:41:04 compute-1 neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958[221118]: [WARNING]  (221143) : All workers exited. Exiting... (0)
Jan 27 22:41:04 compute-1 systemd[1]: libpod-3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006.scope: Deactivated successfully.
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.731 183755 INFO nova.virt.libvirt.driver [-] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Instance destroyed successfully.
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.732 183755 DEBUG nova.objects.instance [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lazy-loading 'resources' on Instance uuid 199b6173-7807-45a0-9d03-8d9e1945a3ca obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Jan 27 22:41:04 compute-1 podman[221393]: 2026-01-27 22:41:04.765226635 +0000 UTC m=+0.047427333 container died 3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260126)
Jan 27 22:41:04 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006-userdata-shm.mount: Deactivated successfully.
Jan 27 22:41:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-e4a86280f9c5b9e9d162d6ab14f4d000b7b1a09ad21897d3c0eb710e738083a2-merged.mount: Deactivated successfully.
Jan 27 22:41:04 compute-1 podman[221393]: 2026-01-27 22:41:04.811240111 +0000 UTC m=+0.093440729 container cleanup 3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Jan 27 22:41:04 compute-1 systemd[1]: libpod-conmon-3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006.scope: Deactivated successfully.
Jan 27 22:41:04 compute-1 podman[221399]: 2026-01-27 22:41:04.835192653 +0000 UTC m=+0.094453504 container remove 3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006 (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.844 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[ee85d6ca-d2a8-48b1-8500-94db8900881a]: (4, ("Tue Jan 27 10:41:04 PM UTC 2026 Sending signal '15' to neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958 (3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006)\n3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006\nTue Jan 27 10:41:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958 (3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006)\n3dbb8e14c02e6c0f3ba60ccd2d8c624e3305ea1c1da5b2509d762bddbbecc006\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.846 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[0d4f9b3d-a167-47e8-b690-77ef578b4f5e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.846 105247 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3bb8ae35-374a-402a-86a2-14918e05f958.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.847 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[e420baed-ec6f-46a4-b15f-a80ae6e50899]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.848 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bb8ae35-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.850 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:04 compute-1 kernel: tap3bb8ae35-30: left promiscuous mode
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.880 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.886 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[d10d2e66-6fe8-4f2c-81e6-267333d36301]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.911 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[83e8e0c2-a10b-40ae-b7e9-7dca24f5705e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.912 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[cb6163e8-8a2e-46ad-926a-cb191077ed64]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.931 212869 DEBUG oslo.privsep.daemon [-] privsep: reply[c920b029-fec4-46f9-9e16-6a31e22908b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 975674, 'reachable_time': 15646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221434, 'error': None, 'target': 'ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.934 105687 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3bb8ae35-374a-402a-86a2-14918e05f958 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Jan 27 22:41:04 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:04.935 105687 DEBUG oslo.privsep.daemon [-] privsep: reply[8728fee2-88d3-4b04-a73b-b992155476f7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Jan 27 22:41:04 compute-1 systemd[1]: run-netns-ovnmeta\x2d3bb8ae35\x2d374a\x2d402a\x2d86a2\x2d14918e05f958.mount: Deactivated successfully.
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.960 183755 DEBUG nova.compute.manager [req-35adfe5b-62a1-4c24-b956-70ee742232c8 req-4cdc10d2-8b11-4821-ad7d-7bed576cc84f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Received event network-vif-unplugged-429754cf-05d3-4257-8e96-2548d7d594e7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.962 183755 DEBUG oslo_concurrency.lockutils [req-35adfe5b-62a1-4c24-b956-70ee742232c8 req-4cdc10d2-8b11-4821-ad7d-7bed576cc84f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.962 183755 DEBUG oslo_concurrency.lockutils [req-35adfe5b-62a1-4c24-b956-70ee742232c8 req-4cdc10d2-8b11-4821-ad7d-7bed576cc84f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.963 183755 DEBUG oslo_concurrency.lockutils [req-35adfe5b-62a1-4c24-b956-70ee742232c8 req-4cdc10d2-8b11-4821-ad7d-7bed576cc84f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.963 183755 DEBUG nova.compute.manager [req-35adfe5b-62a1-4c24-b956-70ee742232c8 req-4cdc10d2-8b11-4821-ad7d-7bed576cc84f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] No waiting events found dispatching network-vif-unplugged-429754cf-05d3-4257-8e96-2548d7d594e7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:41:04 compute-1 nova_compute[183751]: 2026-01-27 22:41:04.963 183755 DEBUG nova.compute.manager [req-35adfe5b-62a1-4c24-b956-70ee742232c8 req-4cdc10d2-8b11-4821-ad7d-7bed576cc84f 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Received event network-vif-unplugged-429754cf-05d3-4257-8e96-2548d7d594e7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:41:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:05.142 105247 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '3e:b5:ec', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2a:1e:58:12:07:ea'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.142 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:05 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:05.143 105247 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.241 183755 DEBUG nova.virt.libvirt.vif [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2026-01-27T22:39:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategyVolume-server-1423487030',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategyvolume-server-142348703',id=27,image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-27T22:40:16Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6857f0c4294a43aab72cea9f0842f4c8',ramdisk_id='',reservation_id='r-te8pm4wy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,reader,member,admin',image_base_image_ref='46eb297a-0b7d-41f9-8336-a7ae35b5797e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vi
rtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategyVolume-756468494',owner_user_name='tempest-TestExecuteZoneMigrationStrategyVolume-756468494-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-27T22:40:16Z,user_data=None,user_id='b54ab6cd3d74475e9b38cfa8f4f224bf',uuid=199b6173-7807-45a0-9d03-8d9e1945a3ca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "429754cf-05d3-4257-8e96-2548d7d594e7", "address": "fa:16:3e:04:e4:55", "network": {"id": "3bb8ae35-374a-402a-86a2-14918e05f958", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-438128632-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef8b4fcea1b9482fbcc882f3383af9f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap429754cf-05", "ovs_interfaceid": "429754cf-05d3-4257-8e96-2548d7d594e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.242 183755 DEBUG nova.network.os_vif_util [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Converting VIF {"id": "429754cf-05d3-4257-8e96-2548d7d594e7", "address": "fa:16:3e:04:e4:55", "network": {"id": "3bb8ae35-374a-402a-86a2-14918e05f958", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategyVolume-438128632-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ef8b4fcea1b9482fbcc882f3383af9f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap429754cf-05", "ovs_interfaceid": "429754cf-05d3-4257-8e96-2548d7d594e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.243 183755 DEBUG nova.network.os_vif_util [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:e4:55,bridge_name='br-int',has_traffic_filtering=True,id=429754cf-05d3-4257-8e96-2548d7d594e7,network=Network(3bb8ae35-374a-402a-86a2-14918e05f958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap429754cf-05') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.243 183755 DEBUG os_vif [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:e4:55,bridge_name='br-int',has_traffic_filtering=True,id=429754cf-05d3-4257-8e96-2548d7d594e7,network=Network(3bb8ae35-374a-402a-86a2-14918e05f958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap429754cf-05') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.245 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.246 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap429754cf-05, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.247 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.249 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.250 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.250 183755 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ec8af993-f77b-4d43-83b1-9c02bf9acc59) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.251 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.252 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.255 183755 INFO os_vif [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:e4:55,bridge_name='br-int',has_traffic_filtering=True,id=429754cf-05d3-4257-8e96-2548d7d594e7,network=Network(3bb8ae35-374a-402a-86a2-14918e05f958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap429754cf-05')
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.256 183755 INFO nova.virt.libvirt.driver [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Deleting instance files /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca_del
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.257 183755 INFO nova.virt.libvirt.driver [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Deletion of /var/lib/nova/instances/199b6173-7807-45a0-9d03-8d9e1945a3ca_del complete
Jan 27 22:41:05 compute-1 podman[193064]: time="2026-01-27T22:41:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:41:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:41:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:41:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:41:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.777 183755 INFO nova.compute.manager [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Took 1.34 seconds to destroy the instance on the hypervisor.
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.777 183755 DEBUG oslo.service.backend._eventlet.loopingcall [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.778 183755 DEBUG nova.compute.manager [-] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.778 183755 DEBUG nova.network.neutron [-] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Jan 27 22:41:05 compute-1 nova_compute[183751]: 2026-01-27 22:41:05.779 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:41:06 compute-1 nova_compute[183751]: 2026-01-27 22:41:06.762 183755 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Jan 27 22:41:07 compute-1 nova_compute[183751]: 2026-01-27 22:41:07.271 183755 DEBUG nova.compute.manager [req-e413a41b-c1b4-4ca9-8d81-eb71de056437 req-cefaf81a-dde0-46c2-b93c-41cb1378aa16 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Received event network-vif-unplugged-429754cf-05d3-4257-8e96-2548d7d594e7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:41:07 compute-1 nova_compute[183751]: 2026-01-27 22:41:07.272 183755 DEBUG oslo_concurrency.lockutils [req-e413a41b-c1b4-4ca9-8d81-eb71de056437 req-cefaf81a-dde0-46c2-b93c-41cb1378aa16 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Acquiring lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:41:07 compute-1 nova_compute[183751]: 2026-01-27 22:41:07.272 183755 DEBUG oslo_concurrency.lockutils [req-e413a41b-c1b4-4ca9-8d81-eb71de056437 req-cefaf81a-dde0-46c2-b93c-41cb1378aa16 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:41:07 compute-1 nova_compute[183751]: 2026-01-27 22:41:07.273 183755 DEBUG oslo_concurrency.lockutils [req-e413a41b-c1b4-4ca9-8d81-eb71de056437 req-cefaf81a-dde0-46c2-b93c-41cb1378aa16 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:41:07 compute-1 nova_compute[183751]: 2026-01-27 22:41:07.273 183755 DEBUG nova.compute.manager [req-e413a41b-c1b4-4ca9-8d81-eb71de056437 req-cefaf81a-dde0-46c2-b93c-41cb1378aa16 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] No waiting events found dispatching network-vif-unplugged-429754cf-05d3-4257-8e96-2548d7d594e7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Jan 27 22:41:07 compute-1 nova_compute[183751]: 2026-01-27 22:41:07.274 183755 DEBUG nova.compute.manager [req-e413a41b-c1b4-4ca9-8d81-eb71de056437 req-cefaf81a-dde0-46c2-b93c-41cb1378aa16 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Received event network-vif-unplugged-429754cf-05d3-4257-8e96-2548d7d594e7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Jan 27 22:41:07 compute-1 podman[221437]: 2026-01-27 22:41:07.770326422 +0000 UTC m=+0.069633591 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 27 22:41:07 compute-1 podman[221436]: 2026-01-27 22:41:07.770326182 +0000 UTC m=+0.069671642 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 27 22:41:08 compute-1 nova_compute[183751]: 2026-01-27 22:41:08.274 183755 DEBUG nova.network.neutron [-] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Jan 27 22:41:08 compute-1 nova_compute[183751]: 2026-01-27 22:41:08.383 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:08 compute-1 nova_compute[183751]: 2026-01-27 22:41:08.782 183755 INFO nova.compute.manager [-] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Took 3.00 seconds to deallocate network for instance.
Jan 27 22:41:09 compute-1 nova_compute[183751]: 2026-01-27 22:41:09.320 183755 DEBUG oslo_concurrency.lockutils [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:41:09 compute-1 nova_compute[183751]: 2026-01-27 22:41:09.321 183755 DEBUG oslo_concurrency.lockutils [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:41:09 compute-1 nova_compute[183751]: 2026-01-27 22:41:09.346 183755 DEBUG nova.compute.manager [req-28ef3d60-682f-4ff3-acf3-3b273b52314b req-6780682b-3174-4437-97fb-8c0cae1d75bd 67a9f5292f4648fa837488bb6620e8d0 9ac2d5bfa72f4a2491c98d8593d977eb - - default default] [instance: 199b6173-7807-45a0-9d03-8d9e1945a3ca] Received event network-vif-deleted-429754cf-05d3-4257-8e96-2548d7d594e7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Jan 27 22:41:09 compute-1 nova_compute[183751]: 2026-01-27 22:41:09.363 183755 DEBUG nova.scheduler.client.report [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Refreshing inventories for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Jan 27 22:41:09 compute-1 nova_compute[183751]: 2026-01-27 22:41:09.385 183755 DEBUG nova.scheduler.client.report [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Updating ProviderTree inventory for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Jan 27 22:41:09 compute-1 nova_compute[183751]: 2026-01-27 22:41:09.385 183755 DEBUG nova.compute.provider_tree [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Updating inventory in ProviderTree for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Jan 27 22:41:09 compute-1 nova_compute[183751]: 2026-01-27 22:41:09.452 183755 DEBUG nova.scheduler.client.report [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Refreshing aggregate associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Jan 27 22:41:09 compute-1 nova_compute[183751]: 2026-01-27 22:41:09.506 183755 DEBUG nova.scheduler.client.report [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Refreshing trait associations for resource provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05, traits: COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_TIS,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_ARCH_X86_64,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_ARCH_X86_64,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOUND_MODEL_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_CRB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOUND_MODEL_VIRTIO _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Jan 27 22:41:09 compute-1 nova_compute[183751]: 2026-01-27 22:41:09.550 183755 DEBUG nova.compute.provider_tree [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:41:10 compute-1 nova_compute[183751]: 2026-01-27 22:41:10.059 183755 DEBUG nova.scheduler.client.report [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:41:10 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:10.145 105247 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=af804609-b297-47b2-80af-51c874daa876, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 27 22:41:10 compute-1 nova_compute[183751]: 2026-01-27 22:41:10.252 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:10 compute-1 nova_compute[183751]: 2026-01-27 22:41:10.580 183755 DEBUG oslo_concurrency.lockutils [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.259s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:41:10 compute-1 nova_compute[183751]: 2026-01-27 22:41:10.611 183755 INFO nova.scheduler.client.report [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Deleted allocations for instance 199b6173-7807-45a0-9d03-8d9e1945a3ca
Jan 27 22:41:10 compute-1 systemd[1]: Starting dnf makecache...
Jan 27 22:41:10 compute-1 dnf[221474]: Repository 'gating-repo' is missing name in configuration, using id.
Jan 27 22:41:10 compute-1 dnf[221474]: Metadata cache refreshed recently.
Jan 27 22:41:10 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 27 22:41:10 compute-1 systemd[1]: Finished dnf makecache.
Jan 27 22:41:11 compute-1 nova_compute[183751]: 2026-01-27 22:41:11.142 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:41:11 compute-1 nova_compute[183751]: 2026-01-27 22:41:11.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:41:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:11.303 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:41:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:11.303 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:41:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:41:11.303 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:41:11 compute-1 nova_compute[183751]: 2026-01-27 22:41:11.646 183755 DEBUG oslo_concurrency.lockutils [None req-2fc6e4bf-0bd2-461e-8aa7-88a4caba4197 b54ab6cd3d74475e9b38cfa8f4f224bf 6857f0c4294a43aab72cea9f0842f4c8 - - default default] Lock "199b6173-7807-45a0-9d03-8d9e1945a3ca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.750s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:41:12 compute-1 nova_compute[183751]: 2026-01-27 22:41:12.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:41:13 compute-1 nova_compute[183751]: 2026-01-27 22:41:13.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:41:13 compute-1 nova_compute[183751]: 2026-01-27 22:41:13.385 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:13 compute-1 nova_compute[183751]: 2026-01-27 22:41:13.870 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:41:13 compute-1 nova_compute[183751]: 2026-01-27 22:41:13.871 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:41:13 compute-1 nova_compute[183751]: 2026-01-27 22:41:13.872 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:41:13 compute-1 nova_compute[183751]: 2026-01-27 22:41:13.872 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:41:14 compute-1 nova_compute[183751]: 2026-01-27 22:41:14.099 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:41:14 compute-1 nova_compute[183751]: 2026-01-27 22:41:14.102 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:41:14 compute-1 nova_compute[183751]: 2026-01-27 22:41:14.127 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:41:14 compute-1 nova_compute[183751]: 2026-01-27 22:41:14.128 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5755MB free_disk=73.13681030273438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:41:14 compute-1 nova_compute[183751]: 2026-01-27 22:41:14.128 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:41:14 compute-1 nova_compute[183751]: 2026-01-27 22:41:14.129 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:41:15 compute-1 nova_compute[183751]: 2026-01-27 22:41:15.187 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:41:15 compute-1 nova_compute[183751]: 2026-01-27 22:41:15.188 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:41:14 up  2:43,  0 user,  load average: 1.08, 0.43, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:41:15 compute-1 nova_compute[183751]: 2026-01-27 22:41:15.219 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:41:15 compute-1 nova_compute[183751]: 2026-01-27 22:41:15.254 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:15 compute-1 nova_compute[183751]: 2026-01-27 22:41:15.726 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:41:16 compute-1 nova_compute[183751]: 2026-01-27 22:41:16.234 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:41:16 compute-1 nova_compute[183751]: 2026-01-27 22:41:16.235 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:41:16 compute-1 podman[221478]: 2026-01-27 22:41:16.807557037 +0000 UTC m=+0.104446590 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:41:18 compute-1 nova_compute[183751]: 2026-01-27 22:41:18.235 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:41:18 compute-1 nova_compute[183751]: 2026-01-27 22:41:18.235 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:41:18 compute-1 nova_compute[183751]: 2026-01-27 22:41:18.236 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:41:18 compute-1 nova_compute[183751]: 2026-01-27 22:41:18.236 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:41:18 compute-1 nova_compute[183751]: 2026-01-27 22:41:18.388 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:19 compute-1 openstack_network_exporter[195945]: ERROR   22:41:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:41:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:41:19 compute-1 openstack_network_exporter[195945]: ERROR   22:41:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:41:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:41:20 compute-1 nova_compute[183751]: 2026-01-27 22:41:20.256 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:20 compute-1 nova_compute[183751]: 2026-01-27 22:41:20.496 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:21 compute-1 nova_compute[183751]: 2026-01-27 22:41:21.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:41:23 compute-1 nova_compute[183751]: 2026-01-27 22:41:23.417 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:25 compute-1 nova_compute[183751]: 2026-01-27 22:41:25.259 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:28 compute-1 nova_compute[183751]: 2026-01-27 22:41:28.419 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:30 compute-1 nova_compute[183751]: 2026-01-27 22:41:30.263 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:33 compute-1 nova_compute[183751]: 2026-01-27 22:41:33.450 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:34 compute-1 podman[221502]: 2026-01-27 22:41:34.819395361 +0000 UTC m=+0.129431578 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 27 22:41:35 compute-1 nova_compute[183751]: 2026-01-27 22:41:35.265 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:35 compute-1 podman[193064]: time="2026-01-27T22:41:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:41:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:41:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:41:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:41:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 27 22:41:38 compute-1 nova_compute[183751]: 2026-01-27 22:41:38.493 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:38 compute-1 podman[221529]: 2026-01-27 22:41:38.779708862 +0000 UTC m=+0.077570926 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Jan 27 22:41:38 compute-1 podman[221528]: 2026-01-27 22:41:38.791940175 +0000 UTC m=+0.090921227 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public)
Jan 27 22:41:40 compute-1 nova_compute[183751]: 2026-01-27 22:41:40.267 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:43 compute-1 nova_compute[183751]: 2026-01-27 22:41:43.494 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:45 compute-1 nova_compute[183751]: 2026-01-27 22:41:45.269 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:47 compute-1 podman[221568]: 2026-01-27 22:41:47.780819605 +0000 UTC m=+0.083065043 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 27 22:41:48 compute-1 nova_compute[183751]: 2026-01-27 22:41:48.495 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:49 compute-1 openstack_network_exporter[195945]: ERROR   22:41:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:41:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:41:49 compute-1 openstack_network_exporter[195945]: ERROR   22:41:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:41:49 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:41:50 compute-1 nova_compute[183751]: 2026-01-27 22:41:50.272 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:53 compute-1 nova_compute[183751]: 2026-01-27 22:41:53.547 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:55 compute-1 nova_compute[183751]: 2026-01-27 22:41:55.274 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:41:55 compute-1 ovn_controller[95969]: 2026-01-27T22:41:55Z|00134|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Jan 27 22:41:58 compute-1 nova_compute[183751]: 2026-01-27 22:41:58.548 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:00 compute-1 nova_compute[183751]: 2026-01-27 22:42:00.276 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:03 compute-1 nova_compute[183751]: 2026-01-27 22:42:03.550 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:05 compute-1 nova_compute[183751]: 2026-01-27 22:42:05.278 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:05 compute-1 podman[193064]: time="2026-01-27T22:42:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:42:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:42:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:42:05 compute-1 podman[193064]: @ - - [27/Jan/2026:22:42:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 27 22:42:05 compute-1 podman[221594]: 2026-01-27 22:42:05.837200129 +0000 UTC m=+0.139617460 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260126, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 27 22:42:08 compute-1 nova_compute[183751]: 2026-01-27 22:42:08.553 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:09 compute-1 nova_compute[183751]: 2026-01-27 22:42:09.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:09 compute-1 nova_compute[183751]: 2026-01-27 22:42:09.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Jan 27 22:42:09 compute-1 nova_compute[183751]: 2026-01-27 22:42:09.657 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Jan 27 22:42:09 compute-1 podman[221622]: 2026-01-27 22:42:09.78126434 +0000 UTC m=+0.075652470 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20260126, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 27 22:42:09 compute-1 podman[221621]: 2026-01-27 22:42:09.793310087 +0000 UTC m=+0.094517775 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, config_id=openstack_network_exporter)
Jan 27 22:42:10 compute-1 nova_compute[183751]: 2026-01-27 22:42:10.280 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:10 compute-1 nova_compute[183751]: 2026-01-27 22:42:10.652 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:42:11.304 105247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:42:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:42:11.305 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:42:11 compute-1 ovn_metadata_agent[105242]: 2026-01-27 22:42:11.305 105247 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.558 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.666 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.667 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.668 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.896 183755 WARNING nova.virt.libvirt.driver [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.898 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.922 183755 DEBUG oslo_concurrency.processutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.923 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5775MB free_disk=73.1368179321289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.923 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Jan 27 22:42:13 compute-1 nova_compute[183751]: 2026-01-27 22:42:13.924 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Jan 27 22:42:15 compute-1 nova_compute[183751]: 2026-01-27 22:42:15.005 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Jan 27 22:42:15 compute-1 nova_compute[183751]: 2026-01-27 22:42:15.006 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 22:42:13 up  2:44,  0 user,  load average: 0.60, 0.40, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Jan 27 22:42:15 compute-1 nova_compute[183751]: 2026-01-27 22:42:15.044 183755 DEBUG nova.compute.provider_tree [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed in ProviderTree for provider: 18406e9c-09cc-4d76-bc69-d3d1c0683e05 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Jan 27 22:42:15 compute-1 nova_compute[183751]: 2026-01-27 22:42:15.283 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:15 compute-1 nova_compute[183751]: 2026-01-27 22:42:15.560 183755 DEBUG nova.scheduler.client.report [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Inventory has not changed for provider 18406e9c-09cc-4d76-bc69-d3d1c0683e05 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Jan 27 22:42:16 compute-1 nova_compute[183751]: 2026-01-27 22:42:16.074 183755 DEBUG nova.compute.resource_tracker [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Jan 27 22:42:16 compute-1 nova_compute[183751]: 2026-01-27 22:42:16.075 183755 DEBUG oslo_concurrency.lockutils [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Jan 27 22:42:17 compute-1 nova_compute[183751]: 2026-01-27 22:42:17.074 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:17 compute-1 nova_compute[183751]: 2026-01-27 22:42:17.075 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:17 compute-1 nova_compute[183751]: 2026-01-27 22:42:17.147 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:17 compute-1 nova_compute[183751]: 2026-01-27 22:42:17.148 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Jan 27 22:42:18 compute-1 nova_compute[183751]: 2026-01-27 22:42:18.562 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:18 compute-1 podman[221660]: 2026-01-27 22:42:18.751605721 +0000 UTC m=+0.058123836 container health_status 33c44453720dc596f879e7dfeb682089a602b2f0775d0aeb01f8237b90bc8083 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 27 22:42:19 compute-1 openstack_network_exporter[195945]: ERROR   22:42:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Jan 27 22:42:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:42:19 compute-1 openstack_network_exporter[195945]: ERROR   22:42:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Jan 27 22:42:19 compute-1 openstack_network_exporter[195945]: 
Jan 27 22:42:20 compute-1 nova_compute[183751]: 2026-01-27 22:42:20.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:20 compute-1 nova_compute[183751]: 2026-01-27 22:42:20.285 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:23 compute-1 nova_compute[183751]: 2026-01-27 22:42:23.148 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:23 compute-1 nova_compute[183751]: 2026-01-27 22:42:23.565 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:25 compute-1 nova_compute[183751]: 2026-01-27 22:42:25.287 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:27 compute-1 sshd-session[221684]: Accepted publickey for zuul from 192.168.122.10 port 33476 ssh2: ECDSA SHA256:tJgNt86jXJ03X0b5nwoQJ6QRjM6HNY8aH1v61R57wHc
Jan 27 22:42:27 compute-1 systemd-logind[786]: New session 28 of user zuul.
Jan 27 22:42:27 compute-1 systemd[1]: Started Session 28 of User zuul.
Jan 27 22:42:27 compute-1 sshd-session[221684]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 27 22:42:27 compute-1 sudo[221688]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 27 22:42:27 compute-1 sudo[221688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 27 22:42:28 compute-1 nova_compute[183751]: 2026-01-27 22:42:28.567 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:30 compute-1 nova_compute[183751]: 2026-01-27 22:42:30.289 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:31 compute-1 nova_compute[183751]: 2026-01-27 22:42:31.149 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:31 compute-1 nova_compute[183751]: 2026-01-27 22:42:31.149 183755 DEBUG nova.compute.manager [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Jan 27 22:42:32 compute-1 ovs-vsctl[221858]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 27 22:42:32 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 221712 (sos)
Jan 27 22:42:32 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 27 22:42:32 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 27 22:42:32 compute-1 virtqemud[183420]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 27 22:42:32 compute-1 virtqemud[183420]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 27 22:42:33 compute-1 virtqemud[183420]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 27 22:42:33 compute-1 nova_compute[183751]: 2026-01-27 22:42:33.568 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:34 compute-1 crontab[222267]: (root) LIST (root)
Jan 27 22:42:35 compute-1 nova_compute[183751]: 2026-01-27 22:42:35.291 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:35 compute-1 podman[193064]: time="2026-01-27T22:42:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 27 22:42:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:42:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 15351 "" "Go-http-client/1.1"
Jan 27 22:42:35 compute-1 podman[193064]: @ - - [27/Jan/2026:22:42:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Jan 27 22:42:36 compute-1 systemd[1]: Starting Hostname Service...
Jan 27 22:42:36 compute-1 systemd[1]: Started Hostname Service.
Jan 27 22:42:36 compute-1 podman[222385]: 2026-01-27 22:42:36.290994944 +0000 UTC m=+0.136465991 container health_status 0cad290046088b2707c6d8143054d7d0636fb2e5c65d156ba6c10c9e9b1b7d08 (image=38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20260126, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 27 22:42:38 compute-1 nova_compute[183751]: 2026-01-27 22:42:38.569 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:40 compute-1 nova_compute[183751]: 2026-01-27 22:42:40.293 183755 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Jan 27 22:42:40 compute-1 nova_compute[183751]: 2026-01-27 22:42:40.651 183755 DEBUG oslo_service.periodic_task [None req-4299baf5-9500-4d5f-a678-78051b88a013 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Jan 27 22:42:40 compute-1 podman[222823]: 2026-01-27 22:42:40.758574287 +0000 UTC m=+0.060356282 container health_status 0d0a4af61437a9c40a849d70276a925ac43fa01d95dfe66b5967084056e4c01d (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-1906ca889f747a7874c38f7dde1e751324374c17012dee147154da58b6efddd8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Jan 27 22:42:40 compute-1 podman[222826]: 2026-01-27 22:42:40.776764656 +0000 UTC m=+0.078486060 container health_status 6020c83b652d75a8fa36bfb5df90ce560476f30b85f9433f54a20db45138896d (image=38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260126, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b635a236d960631fc19c149321360c7a75f041485829f4e3a87e8e0c98b1f960-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-17cb99a33a4e6d417c3dd2a6dcb74eea10a4f7305c7f925ae949e20e59ecdd00-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.195:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
